10:56:46 Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/114149 10:56:46 Running as SYSTEM 10:56:46 [EnvInject] - Loading node environment variables. 10:56:46 Building remotely on prd-ubuntu2004-docker-4c-16g-2816 (ubuntu2004-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-scandium 10:56:46 [ssh-agent] Looking for ssh-agent implementation... 10:56:46 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 10:56:46 $ ssh-agent 10:56:46 SSH_AUTH_SOCK=/tmp/ssh-XifRBmbMEA0P/agent.12208 10:56:46 SSH_AGENT_PID=12211 10:56:46 [ssh-agent] Started. 10:56:46 Running ssh-add (command line suppressed) 10:56:47 Identity added: /w/workspace/transportpce-tox-verify-scandium@tmp/private_key_14482163417575738728.key (/w/workspace/transportpce-tox-verify-scandium@tmp/private_key_14482163417575738728.key) 10:56:47 [ssh-agent] Using credentials jenkins (jenkins-ssh) 10:56:47 The recommended git tool is: NONE 10:56:49 using credential jenkins-ssh 10:56:49 Wiping out workspace first. 10:56:49 Cloning the remote Git repository 10:56:49 Cloning repository git://devvexx.opendaylight.org/mirror/transportpce 10:56:49 > git init /w/workspace/transportpce-tox-verify-scandium # timeout=10 10:56:49 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 10:56:49 > git --version # timeout=10 10:56:49 > git --version # 'git version 2.25.1' 10:56:49 using GIT_SSH to set credentials jenkins-ssh 10:56:49 Verifying host key using known hosts file 10:56:49 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 10:56:49 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10 10:56:52 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 10:56:52 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 10:56:53 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 10:56:53 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 10:56:53 using GIT_SSH to set credentials jenkins-ssh 10:56:53 Verifying host key using known hosts file 10:56:53 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
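The next step fetches the change under review by its Gerrit refspec. A minimal sketch of reproducing that checkout outside Jenkins, assuming the public Gerrit remote https://git.opendaylight.org/gerrit/transportpce instead of the internal devvexx mirror used by the job (refspec and commit are taken from the fetch below):

git clone https://git.opendaylight.org/gerrit/transportpce && cd transportpce
git fetch origin refs/changes/49/114149/2    # patchset 2 of change 114149
git checkout FETCH_HEAD                      # should resolve to commit 345041be3ae176e94cd9d6aa38fae3eca43ca099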
10:56:53 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/49/114149/2 # timeout=10 10:56:53 > git rev-parse 345041be3ae176e94cd9d6aa38fae3eca43ca099^{commit} # timeout=10 10:56:53 Checking out Revision 345041be3ae176e94cd9d6aa38fae3eca43ca099 (refs/changes/49/114149/2) 10:56:53 > git config core.sparsecheckout # timeout=10 10:56:53 > git checkout -f 345041be3ae176e94cd9d6aa38fae3eca43ca099 # timeout=10 10:56:56 Commit message: "Fix grid spectrum computation (reversed logic)" 10:56:56 > git rev-parse FETCH_HEAD^{commit} # timeout=10 10:56:56 > git rev-list --no-walk a4ed7855478d5f1f47f5f7d32f13589a7683f4e7 # timeout=10 10:56:56 > git remote # timeout=10 10:56:56 > git submodule init # timeout=10 10:56:56 > git submodule sync # timeout=10 10:56:56 > git config --get remote.origin.url # timeout=10 10:56:56 > git submodule init # timeout=10 10:56:56 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10 10:56:56 ERROR: No submodules found. 10:56:57 provisioning config files... 10:56:57 copy managed file [npmrc] to file:/home/jenkins/.npmrc 10:56:57 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 10:56:57 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins13513949192184487658.sh 10:56:57 ---> python-tools-install.sh 10:56:57 Setup pyenv: 10:56:57 * system (set by /opt/pyenv/version) 10:56:57 * 3.8.13 (set by /opt/pyenv/version) 10:56:57 * 3.9.13 (set by /opt/pyenv/version) 10:56:57 * 3.10.13 (set by /opt/pyenv/version) 10:56:57 * 3.11.7 (set by /opt/pyenv/version) 10:57:01 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-KOUk 10:57:01 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 10:57:04 lf-activate-venv(): INFO: Installing: lftools 10:57:33 lf-activate-venv(): INFO: Adding /tmp/venv-KOUk/bin to PATH 10:57:33 Generating Requirements File 10:57:53 Python 3.11.7 10:57:53 pip 24.3.1 from /tmp/venv-KOUk/lib/python3.11/site-packages/pip (python 3.11) 10:57:53 appdirs==1.4.4 10:57:53 argcomplete==3.5.1 10:57:53 aspy.yaml==1.3.0 10:57:53 attrs==24.2.0 10:57:53 autopage==0.5.2 10:57:53 beautifulsoup4==4.12.3 10:57:53 boto3==1.35.51 10:57:53 botocore==1.35.51 10:57:53 bs4==0.0.2 10:57:53 cachetools==5.5.0 10:57:53 certifi==2024.8.30 10:57:53 cffi==1.17.1 10:57:53 cfgv==3.4.0 10:57:53 chardet==5.2.0 10:57:53 charset-normalizer==3.4.0 10:57:53 click==8.1.7 10:57:53 cliff==4.7.0 10:57:53 cmd2==2.5.0 10:57:53 cryptography==3.3.2 10:57:53 debtcollector==3.0.0 10:57:53 decorator==5.1.1 10:57:53 defusedxml==0.7.1 10:57:53 Deprecated==1.2.14 10:57:53 distlib==0.3.9 10:57:53 dnspython==2.7.0 10:57:53 docker==4.2.2 10:57:53 dogpile.cache==1.3.3 10:57:53 durationpy==0.9 10:57:53 email_validator==2.2.0 10:57:53 filelock==3.16.1 10:57:53 future==1.0.0 10:57:53 gitdb==4.0.11 10:57:53 GitPython==3.1.43 10:57:53 google-auth==2.35.0 10:57:53 httplib2==0.22.0 10:57:53 identify==2.6.1 10:57:53 idna==3.10 10:57:53 importlib-resources==1.5.0 10:57:53 iso8601==2.1.0 10:57:53 Jinja2==3.1.4 10:57:53 jmespath==1.0.1 10:57:53 jsonpatch==1.33 10:57:53 jsonpointer==3.0.0 10:57:53 jsonschema==4.23.0 10:57:53 jsonschema-specifications==2024.10.1 10:57:53 keystoneauth1==5.8.0 10:57:53 kubernetes==31.0.0 10:57:53 lftools==0.37.10 10:57:53 lxml==5.3.0 10:57:53 MarkupSafe==3.0.2 10:57:53 msgpack==1.1.0 10:57:53 multi_key_dict==2.0.3 10:57:53 munch==4.0.0 10:57:53 netaddr==1.3.0 10:57:53 netifaces==0.11.0 10:57:53 niet==1.4.2 10:57:53 nodeenv==1.9.1 10:57:53 oauth2client==4.1.3 10:57:53 oauthlib==3.2.2 
10:57:53 openstacksdk==4.1.0 10:57:53 os-client-config==2.1.0 10:57:53 os-service-types==1.7.0 10:57:53 osc-lib==3.1.0 10:57:53 oslo.config==9.6.0 10:57:53 oslo.context==5.6.0 10:57:53 oslo.i18n==6.4.0 10:57:53 oslo.log==6.1.2 10:57:53 oslo.serialization==5.5.0 10:57:53 oslo.utils==7.3.0 10:57:53 packaging==24.1 10:57:53 pbr==6.1.0 10:57:53 platformdirs==4.3.6 10:57:53 prettytable==3.11.0 10:57:53 pyasn1==0.6.1 10:57:53 pyasn1_modules==0.4.1 10:57:53 pycparser==2.22 10:57:53 pygerrit2==2.0.15 10:57:53 PyGithub==2.4.0 10:57:53 PyJWT==2.9.0 10:57:53 PyNaCl==1.5.0 10:57:53 pyparsing==2.4.7 10:57:53 pyperclip==1.9.0 10:57:53 pyrsistent==0.20.0 10:57:53 python-cinderclient==9.6.0 10:57:53 python-dateutil==2.9.0.post0 10:57:53 python-heatclient==4.0.0 10:57:53 python-jenkins==1.8.2 10:57:53 python-keystoneclient==5.5.0 10:57:53 python-magnumclient==4.7.0 10:57:53 python-openstackclient==7.2.1 10:57:53 python-swiftclient==4.6.0 10:57:53 PyYAML==6.0.2 10:57:53 referencing==0.35.1 10:57:53 requests==2.32.3 10:57:53 requests-oauthlib==2.0.0 10:57:53 requestsexceptions==1.4.0 10:57:53 rfc3986==2.0.0 10:57:53 rpds-py==0.20.0 10:57:53 rsa==4.9 10:57:53 ruamel.yaml==0.18.6 10:57:53 ruamel.yaml.clib==0.2.12 10:57:53 s3transfer==0.10.3 10:57:53 simplejson==3.19.3 10:57:53 six==1.16.0 10:57:53 smmap==5.0.1 10:57:53 soupsieve==2.6 10:57:53 stevedore==5.3.0 10:57:53 tabulate==0.9.0 10:57:53 toml==0.10.2 10:57:53 tomlkit==0.13.2 10:57:53 tqdm==4.66.6 10:57:53 typing_extensions==4.12.2 10:57:53 tzdata==2024.2 10:57:53 urllib3==1.26.20 10:57:53 virtualenv==20.27.1 10:57:53 wcwidth==0.2.13 10:57:53 websocket-client==1.8.0 10:57:53 wrapt==1.16.0 10:57:53 xdg==6.0.0 10:57:53 xmltodict==0.14.2 10:57:53 yq==3.4.3 10:57:53 [EnvInject] - Injecting environment variables from a build step. 10:57:53 [EnvInject] - Injecting as environment variables the properties content 10:57:53 PYTHON=python3 10:57:53 10:57:53 [EnvInject] - Variables injected successfully. 
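The package listing above is the "Generating Requirements File" output of the venv that python-tools-install.sh builds for lftools. A condensed sketch of the equivalent steps, with an illustrative venv path in place of the random /tmp/venv-KOUk directory created by mktemp:

python3 -m venv /tmp/venv-example            # the job uses mktemp -d /tmp/venv-XXXX
/tmp/venv-example/bin/pip install --upgrade pip
/tmp/venv-example/bin/pip install lftools
/tmp/venv-example/bin/pip freeze             # yields the requirements listing shown above (Python 3.11.7 on this builder)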
10:57:53 [transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins6855179762805164317.sh 10:57:53 ---> tox-install.sh 10:57:53 + source /home/jenkins/lf-env.sh 10:57:53 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 10:57:53 ++ mktemp -d /tmp/venv-XXXX 10:57:53 + lf_venv=/tmp/venv-s9J1 10:57:53 + local venv_file=/tmp/.os_lf_venv 10:57:53 + local python=python3 10:57:53 + local options 10:57:53 + local set_path=true 10:57:53 + local install_args= 10:57:53 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 10:57:53 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 10:57:53 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 10:57:53 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 10:57:53 + true 10:57:53 + case $1 in 10:57:53 + venv_file=/tmp/.toxenv 10:57:53 + shift 2 10:57:53 + true 10:57:53 + case $1 in 10:57:53 + shift 10:57:53 + break 10:57:53 + case $python in 10:57:53 + local pkg_list= 10:57:53 + [[ -d /opt/pyenv ]] 10:57:53 + echo 'Setup pyenv:' 10:57:53 Setup pyenv: 10:57:53 + export PYENV_ROOT=/opt/pyenv 10:57:53 + PYENV_ROOT=/opt/pyenv 10:57:53 + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:57:53 + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:57:53 + pyenv versions 10:57:53 system 10:57:53 3.8.13 10:57:53 3.9.13 10:57:53 3.10.13 10:57:54 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 10:57:54 + command -v pyenv 10:57:54 ++ pyenv init - --no-rehash 10:57:54 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 10:57:54 for i in ${!paths[@]}; do 10:57:54 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 10:57:54 fi; done; 10:57:54 echo "${paths[*]}"'\'')" 10:57:54 export PATH="/opt/pyenv/shims:${PATH}" 10:57:54 export PYENV_SHELL=bash 10:57:54 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 10:57:54 pyenv() { 10:57:54 local command 10:57:54 command="${1:-}" 10:57:54 if [ "$#" -gt 0 ]; then 10:57:54 shift 10:57:54 fi 10:57:54 10:57:54 case "$command" in 10:57:54 rehash|shell) 10:57:54 eval "$(pyenv "sh-$command" "$@")" 10:57:54 ;; 10:57:54 *) 10:57:54 command pyenv "$command" "$@" 10:57:54 ;; 10:57:54 esac 10:57:54 }' 10:57:54 +++ bash --norc -ec 'IFS=:; paths=($PATH); 10:57:54 for i in ${!paths[@]}; do 10:57:54 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 10:57:54 fi; done; 10:57:54 echo "${paths[*]}"' 10:57:54 ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:57:54 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:57:54 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:57:54 ++ export PYENV_SHELL=bash 10:57:54 ++ PYENV_SHELL=bash 10:57:54 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 10:57:54 +++ complete -F _pyenv pyenv 10:57:54 ++ lf-pyver python3 10:57:54 ++ 
local py_version_xy=python3 10:57:54 ++ local py_version_xyz= 10:57:54 ++ pyenv versions 10:57:54 ++ local command 10:57:54 ++ command=versions 10:57:54 ++ '[' 1 -gt 0 ']' 10:57:54 ++ shift 10:57:54 ++ case "$command" in 10:57:54 ++ command pyenv versions 10:57:54 ++ pyenv versions 10:57:54 ++ grep -E '^[0-9.]*[0-9]$' 10:57:54 ++ awk '{ print $1 }' 10:57:54 ++ sed 's/^[ *]* //' 10:57:54 ++ [[ ! -s /tmp/.pyenv_versions ]] 10:57:54 +++ grep '^3' /tmp/.pyenv_versions 10:57:54 +++ sort -V 10:57:54 +++ tail -n 1 10:57:54 ++ py_version_xyz=3.11.7 10:57:54 ++ [[ -z 3.11.7 ]] 10:57:54 ++ echo 3.11.7 10:57:54 ++ return 0 10:57:54 + pyenv local 3.11.7 10:57:54 + local command 10:57:54 + command=local 10:57:54 + '[' 2 -gt 0 ']' 10:57:54 + shift 10:57:54 + case "$command" in 10:57:54 + command pyenv local 3.11.7 10:57:54 + pyenv local 3.11.7 10:57:54 + for arg in "$@" 10:57:54 + case $arg in 10:57:54 + pkg_list+='tox ' 10:57:54 + for arg in "$@" 10:57:54 + case $arg in 10:57:54 + pkg_list+='virtualenv ' 10:57:54 + for arg in "$@" 10:57:54 + case $arg in 10:57:54 + pkg_list+='urllib3~=1.26.15 ' 10:57:54 + [[ -f /tmp/.toxenv ]] 10:57:54 + [[ ! -f /tmp/.toxenv ]] 10:57:54 + [[ -n '' ]] 10:57:54 + python3 -m venv /tmp/venv-s9J1 10:57:58 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-s9J1' 10:57:58 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-s9J1 10:57:58 + echo /tmp/venv-s9J1 10:57:58 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' 10:57:58 lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv 10:57:58 + /tmp/venv-s9J1/bin/python3 -m pip install --upgrade --quiet pip virtualenv 10:58:01 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 10:58:01 + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' 10:58:01 lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 10:58:01 + /tmp/venv-s9J1/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 10:58:03 + type python3 10:58:03 + true 10:58:03 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-s9J1/bin to PATH' 10:58:03 lf-activate-venv(): INFO: Adding /tmp/venv-s9J1/bin to PATH 10:58:03 + PATH=/tmp/venv-s9J1/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:03 + return 0 10:58:03 + python3 --version 10:58:03 Python 3.11.7 10:58:03 + python3 -m pip --version 10:58:03 pip 24.3.1 from /tmp/venv-s9J1/lib/python3.11/site-packages/pip (python 3.11) 10:58:03 + python3 -m pip freeze 10:58:04 cachetools==5.5.0 10:58:04 chardet==5.2.0 10:58:04 colorama==0.4.6 10:58:04 distlib==0.3.9 10:58:04 filelock==3.16.1 10:58:04 packaging==24.1 10:58:04 platformdirs==4.3.6 10:58:04 pluggy==1.5.0 10:58:04 pyproject-api==1.8.0 10:58:04 tox==4.23.2 10:58:04 urllib3==1.26.20 10:58:04 virtualenv==20.27.1 10:58:04 [transportpce-tox-verify-scandium] $ /bin/sh -xe /tmp/jenkins4526113449502990317.sh 10:58:04 [EnvInject] - Injecting environment variables from a build step. 10:58:04 [EnvInject] - Injecting as environment variables the properties content 10:58:04 PARALLEL=True 10:58:04 10:58:04 [EnvInject] - Variables injected successfully. 
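The xtrace above shows how lf-activate-venv provisions the tox virtualenv and records its path in /tmp/.toxenv so that the later tox-run.sh step can reuse it instead of rebuilding it. A condensed sketch of that caching logic, simplified from the real lf-env.sh (variable names follow the trace):

venv_file=/tmp/.toxenv
if [[ -f "$venv_file" ]]; then
    lf_venv=$(cat "$venv_file")              # reuse the venv recorded by an earlier step
else
    lf_venv=$(mktemp -d /tmp/venv-XXXX)
    python3 -m venv "$lf_venv"
    echo "$lf_venv" > "$venv_file"           # remember the venv for later steps
fi
"$lf_venv"/bin/python3 -m pip install --upgrade --quiet pip virtualenv
"$lf_venv"/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv 'urllib3~=1.26.15'
export PATH="$lf_venv/bin:$PATH"             # prepend the venv's bin directory to PATH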
10:58:04 [transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins13820452300239709846.sh 10:58:04 ---> tox-run.sh 10:58:04 + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:04 + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/tox 10:58:04 + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/docs 10:58:04 + mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/tox 10:58:04 + cd /w/workspace/transportpce-tox-verify-scandium/. 10:58:04 + source /home/jenkins/lf-env.sh 10:58:04 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 10:58:04 ++ mktemp -d /tmp/venv-XXXX 10:58:04 + lf_venv=/tmp/venv-wuWZ 10:58:04 + local venv_file=/tmp/.os_lf_venv 10:58:04 + local python=python3 10:58:04 + local options 10:58:04 + local set_path=true 10:58:04 + local install_args= 10:58:04 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 10:58:04 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 10:58:04 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 10:58:04 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 10:58:04 + true 10:58:04 + case $1 in 10:58:04 + venv_file=/tmp/.toxenv 10:58:04 + shift 2 10:58:04 + true 10:58:04 + case $1 in 10:58:04 + shift 10:58:04 + break 10:58:04 + case $python in 10:58:04 + local pkg_list= 10:58:04 + [[ -d /opt/pyenv ]] 10:58:04 + echo 'Setup pyenv:' 10:58:04 Setup pyenv: 10:58:04 + export PYENV_ROOT=/opt/pyenv 10:58:04 + PYENV_ROOT=/opt/pyenv 10:58:04 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:04 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:04 + pyenv versions 10:58:04 system 10:58:04 3.8.13 10:58:04 3.9.13 10:58:04 3.10.13 10:58:04 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 10:58:04 + command -v pyenv 10:58:04 ++ pyenv init - --no-rehash 10:58:04 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 10:58:04 for i in ${!paths[@]}; do 10:58:04 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 10:58:04 fi; done; 10:58:04 echo "${paths[*]}"'\'')" 10:58:04 export PATH="/opt/pyenv/shims:${PATH}" 10:58:04 export PYENV_SHELL=bash 10:58:04 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 10:58:04 pyenv() { 10:58:04 local command 10:58:04 command="${1:-}" 10:58:04 if [ "$#" -gt 0 ]; then 10:58:04 shift 10:58:04 fi 10:58:04 10:58:04 case "$command" in 10:58:04 rehash|shell) 10:58:04 eval "$(pyenv "sh-$command" "$@")" 10:58:04 ;; 10:58:04 *) 10:58:04 command pyenv "$command" "$@" 10:58:04 ;; 10:58:04 esac 10:58:04 }' 10:58:04 +++ bash --norc -ec 'IFS=:; paths=($PATH); 10:58:04 for i in ${!paths[@]}; do 10:58:04 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 10:58:04 fi; done; 10:58:04 echo "${paths[*]}"' 10:58:04 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 
10:58:04 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:04 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:04 ++ export PYENV_SHELL=bash 10:58:04 ++ PYENV_SHELL=bash 10:58:04 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 10:58:04 +++ complete -F _pyenv pyenv 10:58:04 ++ lf-pyver python3 10:58:04 ++ local py_version_xy=python3 10:58:04 ++ local py_version_xyz= 10:58:04 ++ pyenv versions 10:58:04 ++ local command 10:58:04 ++ command=versions 10:58:04 ++ '[' 1 -gt 0 ']' 10:58:04 ++ shift 10:58:04 ++ case "$command" in 10:58:04 ++ command pyenv versions 10:58:04 ++ pyenv versions 10:58:04 ++ awk '{ print $1 }' 10:58:04 ++ grep -E '^[0-9.]*[0-9]$' 10:58:04 ++ sed 's/^[ *]* //' 10:58:04 ++ [[ ! -s /tmp/.pyenv_versions ]] 10:58:04 +++ grep '^3' /tmp/.pyenv_versions 10:58:04 +++ tail -n 1 10:58:04 +++ sort -V 10:58:04 ++ py_version_xyz=3.11.7 10:58:04 ++ [[ -z 3.11.7 ]] 10:58:04 ++ echo 3.11.7 10:58:04 ++ return 0 10:58:04 + pyenv local 3.11.7 10:58:04 + local command 10:58:04 + command=local 10:58:04 + '[' 2 -gt 0 ']' 10:58:04 + shift 10:58:04 + case "$command" in 10:58:04 + command pyenv local 3.11.7 10:58:04 + pyenv local 3.11.7 10:58:04 + for arg in "$@" 10:58:04 + case $arg in 10:58:04 + pkg_list+='tox ' 10:58:04 + for arg in "$@" 10:58:04 + case $arg in 10:58:04 + pkg_list+='virtualenv ' 10:58:04 + for arg in "$@" 10:58:04 + case $arg in 10:58:04 + pkg_list+='urllib3~=1.26.15 ' 10:58:04 + [[ -f /tmp/.toxenv ]] 10:58:04 ++ cat /tmp/.toxenv 10:58:04 + lf_venv=/tmp/venv-s9J1 10:58:04 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-s9J1 from' file:/tmp/.toxenv 10:58:04 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-s9J1 from file:/tmp/.toxenv 10:58:04 + /tmp/venv-s9J1/bin/python3 -m pip install --upgrade --quiet pip virtualenv 10:58:05 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 10:58:05 + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' 10:58:05 lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 10:58:05 + /tmp/venv-s9J1/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 10:58:06 + type python3 10:58:06 + true 10:58:06 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-s9J1/bin to PATH' 10:58:06 lf-activate-venv(): INFO: Adding /tmp/venv-s9J1/bin to PATH 10:58:06 + PATH=/tmp/venv-s9J1/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:06 + return 0 10:58:06 + [[ -d /opt/pyenv ]] 10:58:06 + echo '---> Setting up pyenv' 10:58:06 ---> Setting up pyenv 10:58:06 + export PYENV_ROOT=/opt/pyenv 10:58:06 + PYENV_ROOT=/opt/pyenv 10:58:06 + export PATH=/opt/pyenv/bin:/tmp/venv-s9J1/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:06 + PATH=/opt/pyenv/bin:/tmp/venv-s9J1/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 10:58:06 ++ pwd 10:58:06 + PYTHONPATH=/w/workspace/transportpce-tox-verify-scandium 10:58:06 + 
export PYTHONPATH 10:58:06 + export TOX_TESTENV_PASSENV=PYTHONPATH 10:58:06 + TOX_TESTENV_PASSENV=PYTHONPATH 10:58:06 + tox --version 10:58:06 4.23.2 from /tmp/venv-s9J1/lib/python3.11/site-packages/tox/__init__.py 10:58:07 + PARALLEL=True 10:58:07 + TOX_OPTIONS_LIST= 10:58:07 + [[ -n '' ]] 10:58:07 + case ${PARALLEL,,} in 10:58:07 + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' 10:58:07 + tox --parallel auto --parallel-live 10:58:07 + tee -a /w/workspace/transportpce-tox-verify-scandium/archives/tox/tox.log 10:58:08 checkbashisms: freeze> python -m pip freeze --all 10:58:08 docs: install_deps> python -I -m pip install -r docs/requirements.txt 10:58:08 docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt 10:58:08 buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 10:58:09 checkbashisms: pip==24.3.1,setuptools==75.2.0,wheel==0.44.0 10:58:09 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh 10:58:09 checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' 10:58:09 checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + 10:58:10 script ./reflectwarn.sh does not appear to have a #! interpreter line; 10:58:10 you may get strange results 10:58:10 checkbashisms: OK ✔ in 3.09 seconds 10:58:10 pre-commit: install_deps> python -I -m pip install pre-commit 10:58:13 pre-commit: freeze> python -m pip freeze --all 10:58:13 pre-commit: cfgv==3.4.0,distlib==0.3.9,filelock==3.16.1,identify==2.6.1,nodeenv==1.9.1,pip==24.3.1,platformdirs==4.3.6,pre_commit==4.0.1,PyYAML==6.0.2,setuptools==75.2.0,virtualenv==20.27.1,wheel==0.44.0 10:58:13 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh 10:58:13 pre-commit: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' 10:58:13 /usr/bin/cpan 10:58:13 pre-commit: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run --all-files --show-diff-on-failure 10:58:13 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 10:58:13 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 10:58:13 [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. 
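tox-run.sh drives all of the tox environments that appear interleaved below (checkbashisms, docs, docs-linkcheck, buildcontroller, pre-commit, pylint and the functional test suites) through the single parallel invocation shown above. A sketch of running it locally from a checkout, assuming the repository root holds tox.ini as the job's workspace does; the tee target mirrors the archives path used by the job:

cd transportpce                              # repository root (the job's workspace)
mkdir -p archives/tox
tox --parallel auto --parallel-live 2>&1 | tee -a archives/tox/tox.log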
10:58:14 [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo. 10:58:14 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. 10:58:14 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. 10:58:14 buildcontroller: freeze> python -m pip freeze --all 10:58:14 [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. 10:58:15 buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 10:58:15 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_controller.sh 10:58:15 + update-java-alternatives -l 10:58:15 java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 10:58:15 java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64 10:58:15 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 10:58:15 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 10:58:15 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 10:58:15 java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64 10:58:15 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. 10:58:15 + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; 10:58:15 + java -version 10:58:15 [INFO] Initializing environment for https://github.com/perltidy/perltidy. 10:58:15 21 10:58:15 + JAVA_VER=21 10:58:15 + echo 21 10:58:15 + javac -version 10:58:15 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; 10:58:15 21 10:58:15 + JAVAC_VER=21 10:58:15 + echo 21 10:58:15 + [ 21 -ge 21 ] 10:58:15 + [ 21 -ge 21 ] 10:58:15 + echo ok, java is 21 or newer 10:58:15 + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp 10:58:15 ok, java is 21 or newer 10:58:15 2024-10-30 10:58:15 URL:https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz [9083702/9083702] -> "/tmp/apache-maven-3.9.8-bin.tar.gz" [1] 10:58:15 + sudo mkdir -p /opt 10:58:15 + sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt 10:58:16 + sudo ln -s /opt/apache-maven-3.9.8 /opt/maven 10:58:16 + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn 10:58:16 + mvn --version 10:58:16 [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. 10:58:16 [INFO] Once installed this environment will be reused. 10:58:16 [INFO] This may take a few minutes... 
10:58:16 Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256) 10:58:16 Maven home: /opt/maven 10:58:16 Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 10:58:16 Default locale: en, platform encoding: UTF-8 10:58:16 OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix" 10:58:17 NOTE: Picked up JDK_JAVA_OPTIONS: 10:58:17 --add-opens=java.base/java.io=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.lang=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.net=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.nio=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.nio.file=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.util=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.util.jar=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.util.stream=ALL-UNNAMED 10:58:17 --add-opens=java.base/java.util.zip=ALL-UNNAMED 10:58:17 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 10:58:17 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 10:58:17 -Xlog:disable 10:58:20 [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. 10:58:20 [INFO] Once installed this environment will be reused. 10:58:20 [INFO] This may take a few minutes... 10:58:27 [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. 10:58:27 [INFO] Once installed this environment will be reused. 10:58:27 [INFO] This may take a few minutes... 10:58:30 [INFO] Installing environment for https://github.com/perltidy/perltidy. 10:58:30 [INFO] Once installed this environment will be reused. 10:58:30 [INFO] This may take a few minutes... 10:58:37 docs: freeze> python -m pip freeze --all 10:58:38 docs: alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.2,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.3.1,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.2.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 10:58:38 docs: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/html 10:58:38 docs-linkcheck: freeze> python -m pip freeze --all 10:58:38 docs-linkcheck: 
alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.2,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.3.1,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.2.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 10:58:38 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/linkcheck 10:58:40 docs: OK ✔ in 33.54 seconds 10:58:40 pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' 10:58:42 trim trailing whitespace.................................................Passed 10:58:42 Tabs remover.............................................................Passed 10:58:42 autopep8.................................................................docs-linkcheck: OK ✔ in 35.11 seconds 10:58:44 pylint: freeze> python -m pip freeze --all 10:58:45 pylint: astroid==3.3.5,dill==0.3.9,isort==5.13.2,mccabe==0.7.0,pip==24.3.1,platformdirs==4.3.6,pylint==3.3.1,setuptools==75.2.0,tomlkit==0.13.2,wheel==0.44.0 10:58:45 pylint: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + 10:58:47 Passed 10:58:47 perltidy.................................................................Passed 10:58:48 pre-commit: commands[3] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run gitlint-ci --hook-stage manual 10:58:48 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 10:58:48 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 10:58:48 [INFO] Installing environment for https://github.com/jorisroovers/gitlint. 10:58:48 [INFO] Once installed this environment will be reused. 10:58:48 [INFO] This may take a few minutes... 
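The pylint environment above runs a single pylint invocation over every Python file under transportpce_tests/ and fails the build if the score drops below 10. The same check can be run by hand from the tests/ directory of the repository (command reproduced from commands[0] above):

cd tests
find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 \
    --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code \
    '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' \
    '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' \
    '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' +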
10:58:54 gitlint..................................................................Passed 10:59:05 10:59:05 ------------------------------------ 10:59:05 Your code has been rated at 10.00/10 10:59:05 10:59:51 pre-commit: OK ✔ in 44.52 seconds 10:59:51 pylint: OK ✔ in 26.27 seconds 10:59:51 buildcontroller: OK ✔ in 1 minute 42.78 seconds 10:59:51 build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 10:59:51 build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 10:59:51 sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 10:59:51 testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 10:59:57 build_karaf_tests121: freeze> python -m pip freeze --all 10:59:57 build_karaf_tests221: freeze> python -m pip freeze --all 10:59:57 sims: freeze> python -m pip freeze --all 10:59:57 build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 10:59:57 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh 10:59:57 build_karaf_tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 10:59:57 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh 10:59:57 NOTE: Picked up JDK_JAVA_OPTIONS: 10:59:57 --add-opens=java.base/java.io=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.lang=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.net=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio.file=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.jar=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.stream=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.zip=ALL-UNNAMED 10:59:57 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 10:59:57 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 10:59:57 -Xlog:disable 10:59:57 NOTE: Picked up JDK_JAVA_OPTIONS: 10:59:57 --add-opens=java.base/java.io=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.lang=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 10:59:57 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.net=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.nio.file=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.jar=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.stream=ALL-UNNAMED 10:59:57 --add-opens=java.base/java.util.zip=ALL-UNNAMED 10:59:57 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 10:59:57 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 10:59:57 -Xlog:disable 10:59:57 sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 10:59:57 sims: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./install_lightynode.sh 10:59:57 Using lighynode version 20.1.0.2 10:59:57 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory 11:00:01 sims: OK ✔ in 11.23 seconds 11:00:01 build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:00:14 build_karaf_tests71: freeze> python -m pip freeze --all 11:00:14 build_karaf_tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:00:14 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh 11:00:15 NOTE: Picked up JDK_JAVA_OPTIONS: 11:00:15 --add-opens=java.base/java.io=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.lang=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.net=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.nio=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.util=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:00:15 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:00:15 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:00:15 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:00:15 -Xlog:disable 11:00:42 build_karaf_tests221: OK ✔ in 52.54 seconds 11:00:42 build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:00:43 build_karaf_tests121: OK ✔ in 53.6 seconds 11:00:43 tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:00:49 build_karaf_tests_hybrid: freeze> python -m pip 
freeze --all 11:00:49 tests_tapi: freeze> python -m pip freeze --all 11:00:50 tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:00:50 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh tapi 11:00:50 build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:00:50 build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh 11:00:50 using environment variables from ./karaf221.env 11:00:50 pytest -q transportpce_tests/tapi/test01_abstracted_topology.py 11:00:50 NOTE: Picked up JDK_JAVA_OPTIONS: 11:00:50 --add-opens=java.base/java.io=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.lang=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.net=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.nio=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.util=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:00:50 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:00:50 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:00:50 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:00:50 -Xlog:disable 11:01:08 build_karaf_tests71: OK ✔ in 1 minute 2.42 seconds 11:01:08 testsPCE: freeze> python -m pip freeze --all 11:01:08 testsPCE: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.54.1,gnpy4tpce==2.4.7,idna==3.10,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==3.0.2,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.5.0,pbr==5.11.1,pillow==11.0.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.2.0,pytest==8.3.3,python-dateutil==2.9.0.post0,pytz==2024.2,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.3,Werkzeug==2.0.3,wheel==0.44.0,xlrd==1.2.0 11:01:08 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh pce 11:01:08 pytest -q transportpce_tests/pce/test01_pce.py 11:02:08 ....................................... [100%] 11:03:11 20 passed in 122.72s (0:02:02) 11:03:11 pytest -q transportpce_tests/pce/test02_pce_400G.py 11:03:12 ...................... [100%] 11:03:54 9 passed in 42.33s 11:03:54 pytest -q transportpce_tests/pce/test03_gnpy.py 11:04:02 .............. [100%] 11:04:31 8 passed in 36.78s 11:04:31 pytest -q transportpce_tests/pce/test04_pce_bug_fix.py 11:04:46 ............ 
[100%] 11:04:51 50 passed in 241.00s (0:04:00) 11:04:51 pytest -q transportpce_tests/tapi/test02_full_topology.py 11:05:02 ... [100%] 11:05:08 3 passed in 36.84s 11:05:08 build_karaf_tests_hybrid: OK ✔ in 56.81 seconds 11:05:08 testsPCE: OK ✔ in 5 minutes 18.3 seconds 11:05:08 tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:05:14 tests121: freeze> python -m pip freeze --all 11:05:14 tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:05:14 tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1 11:05:14 using environment variables from ./karaf121.env 11:05:14 pytest -q transportpce_tests/1.2.1/test01_portmapping.py 11:06:12 ................................... [100%] 11:07:01 21 passed in 107.32s (0:01:47) 11:07:01 pytest -q transportpce_tests/1.2.1/test02_topo_portmapping.py 11:07:23 ........ [100%] 11:07:48 6 passed in 46.11s 11:07:48 pytest -q transportpce_tests/1.2.1/test03_topology.py 11:07:48 ................................................ [100%] 11:09:42 30 passed in 290.45s (0:04:50) 11:09:42 pytest -q transportpce_tests/tapi/test03_tapi_device_change_notifications.py 11:09:43 .......... [100%] 11:10:05 44 passed in 136.63s (0:02:16) 11:10:05 pytest -q transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py 11:10:53 ..................................... [100%] 11:11:45 24 passed in 100.33s (0:01:40) 11:11:45 pytest -q transportpce_tests/1.2.1/test05_olm.py 11:11:47 ........................................................................................... [100%] 11:14:25 70 passed in 282.70s (0:04:42) 11:14:25 tests_tapi: OK ✔ in 13 minutes 41.7 seconds 11:14:25 tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:14:31 tests71: freeze> python -m pip freeze --all 11:14:31 tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:14:31 tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 7.1 11:14:31 using environment variables from ./karaf71.env 11:14:31 pytest -q transportpce_tests/7.1/test01_portmapping.py 11:14:32 ...... [100%] 11:14:45 40 passed in 180.07s (0:03:00) 11:14:46 pytest -q transportpce_tests/1.2.1/test06_end2end.py 11:15:08 ............ [100%] 11:15:21 12 passed in 49.52s 11:15:21 pytest -q transportpce_tests/7.1/test02_otn_renderer.py 11:15:33 FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF.............................................................. [100%] 11:18:00 62 passed in 158.86s (0:02:38) 11:18:00 pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py 11:18:30 ....F............................................ 
[100%] 11:20:14 48 passed in 133.79s (0:02:13) 11:20:14 pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py 11:20:38 ...................... [100%] 11:21:25 22 passed in 71.43s (0:01:11) 11:21:26 tests71: OK ✔ in 7 minutes 0.72 seconds 11:21:26 tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 11:21:32 tests221: freeze> python -m pip freeze --all 11:21:32 tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 11:21:32 tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 2.2.1 11:21:32 using environment variables from ./karaf221.env 11:21:32 pytest -q transportpce_tests/2.2.1/test01_portmapping.py 11:21:41 FFF [100%] 11:21:45 =================================== FAILURES =================================== 11:21:45 ________________ TransportPCEFulltesting.test_01_connect_xpdrA _________________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 
11:21:45 # The original create_connection function always returns all records. 11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' 11:21:45 body = '{"node": [{"node-id": "XPDRA01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "n...ff-millis": 1800000, "netconf-node-topology:backoff-multiplier": 1.5, "netconf-node-topology:keepalive-delay": 120}}]}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '709', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
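# In this run the locals captured above show http_tunnel_required = False, so the
# proxy/CONNECT branch below is skipped and execution falls straight through to
# self._make_request(), which is where the refused TCP connection to localhost:8182
# finally surfaces as the NewConnectionError reported further down.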
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_01_connect_xpdrA(self): 11:21:45 > response = test_utils.mount_device("XPDRA01", ('xpdra', self.NODE_VERSION)) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:94: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:343: in mount_device 11:21:45 response = put_request(url[RESTCONF_VERSION].format('{}', node), body) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ---------------------------- Captured stdout setup ----------------------------- 11:21:45 starting OpenDaylight... 
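The captured setup output above and below reports that OpenDaylight and the device simulators started, yet every request in this run is refused on localhost:8182, so the controller's RESTCONF listener is evidently not accepting connections by the time the tests issue their PUT calls. A minimal sketch of an out-of-band probe (not part of this console output; the helper name probe_restconf is hypothetical) that mirrors the URL, credentials, retry and timeout values visible in the traceback:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def probe_restconf(node_id: str = "XPDRA01") -> int:
    # Retry(total=0) mirrors the Retry object shown in the traceback: fail immediately
    # instead of retrying a refused connection.
    session = requests.Session()
    session.mount("http://", HTTPAdapter(max_retries=Retry(total=0)))
    url = ("http://localhost:8182/rests/data/network-topology:network-topology/"
           f"topology=topology-netconf/node={node_id}")
    # If nothing is listening on port 8182, this raises requests.exceptions.ConnectionError,
    # the same failure reported for each test in this run.
    response = session.get(url, auth=("admin", "admin"), timeout=(10, 10))
    return response.status_code

If such a probe also raises ConnectionError, the problem lies with the controller process (or its 8182 listener) rather than with the individual test cases.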
11:21:45 starting KARAF TransportPCE build... 11:21:45 Searching for patterns in karaf.log... Pattern found! OpenDaylight started ! 11:21:45 starting simulator xpdra in OpenROADM device version 1.2.1... 11:21:45 Searching for patterns in xpdra-121.log... Pattern found! simulator for xpdra started 11:21:45 starting simulator roadma-full in OpenROADM device version 1.2.1... 11:21:45 Searching for patterns in roadma-121.log... Pattern found! simulator for roadma-full started 11:21:45 starting simulator roadmc-full in OpenROADM device version 1.2.1... 11:21:45 Searching for patterns in roadmc-121.log... Pattern found! simulator for roadmc-full started 11:21:45 starting simulator xpdrc in OpenROADM device version 1.2.1... 11:21:45 Searching for patterns in xpdrc-121.log... Pattern found! simulator for xpdrc started 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_01_connect_xpdrA 11:21:45 ________________ TransportPCEFulltesting.test_02_connect_xpdrC _________________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01' 11:21:45 body = '{"node": [{"node-id": "XPDRC01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "n...ff-millis": 1800000, "netconf-node-topology:backoff-multiplier": 1.5, "netconf-node-topology:keepalive-delay": 120}}]}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '709', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_02_connect_xpdrC(self): 11:21:45 > response = test_utils.mount_device("XPDRC01", ('xpdrc', self.NODE_VERSION)) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:98: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:343: in mount_device 11:21:45 response = put_request(url[RESTCONF_VERSION].format('{}', node), body) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_02_connect_xpdrC 11:21:45 _________________ TransportPCEFulltesting.test_03_connect_rdmA _________________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
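# In this run getaddrinfo(('localhost', 8182)) yields at least one candidate address,
# each sock.connect() attempt below is refused with [Errno 111], and after the loop the
# saved error is re-raised (util/connection.py:85 in the traceback above), so the same
# refusal is reported for every node these tests try to mount.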
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 11:21:45 body = '{"node": [{"node-id": "ROADMA01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "...ff-millis": 1800000, "netconf-node-topology:backoff-multiplier": 1.5, "netconf-node-topology:keepalive-delay": 120}}]}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '710', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_03_connect_rdmA(self): 11:21:45 > response = test_utils.mount_device("ROADMA01", ('roadma-full', self.NODE_VERSION)) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:102: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:343: in mount_device 11:21:45 response = put_request(url[RESTCONF_VERSION].format('{}', node), body) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
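The Retry(total=0, connect=None, read=False, redirect=None, status=None) object shown above is requests' default zero-retry configuration for its HTTPAdapter, which is why a single refused connection immediately becomes MaxRetryError and then ConnectionError in the test. If the suite ever wanted to ride out a controller that is still starting up, the adapter's max_retries is the knob to turn; a minimal sketch, assuming the requests/urllib3 versions installed in this tox environment (this is not something test_utils.py is shown doing in this log):

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(total=3, connect=3, backoff_factor=0.5,
                  allowed_methods=frozenset({"GET", "PUT", "POST", "DELETE"}))
    adapter = HTTPAdapter(max_retries=retry)
    # replaces the default Retry(total=0, read=False) seen in the traceback
    session.mount("http://", adapter)
    # session.put(...) / session.post(...) now retry refused connections with backoff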
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_03_connect_rdmA 11:21:45 _________________ TransportPCEFulltesting.test_04_connect_rdmC _________________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
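At this point test_03_connect_rdmA has failed outright: every attempt to reach the controller's RESTCONF endpoint on localhost:8182 ends in [Errno 111] Connection refused, so the device is never mounted. For reference, a minimal sketch of the kind of RESTCONF PUT that test_utils.mount_device() issues here, reconstructed from the method, URL, headers and (partially elided) body captured above; the snippet is hypothetical and only fields visible in the log are filled in:

    import requests

    BASE = "http://localhost:8182"   # host/port taken from HTTPConnectionPool(host='localhost', port=8182)
    NODE = "ROADMA01"
    url = (f"{BASE}/rests/data/network-topology:network-topology/"
           f"topology=topology-netconf/node={NODE}")
    payload = {
        "node": [{
            "node-id": NODE,
            "netconf-node-topology:netconf-node": {
                "netconf-node-topology:host": "127.0.0.1",
                # the remaining netconf-node-topology:* fields are elided ("...") in the log
                "netconf-node-topology:keepalive-delay": 120,
            },
        }]
    }
    # admin/admin matches the Basic YWRtaW46YWRtaW4= header in the traceback
    resp = requests.put(url, json=payload, auth=("admin", "admin"), timeout=(10, 10))
    print(resp.status_code, resp.text)

With no process listening on port 8182, this sketch fails exactly like the test does, with requests.exceptions.ConnectionError wrapping the urllib3 MaxRetryError.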
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' 11:21:45 body = '{"node": [{"node-id": "ROADMC01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "...ff-millis": 1800000, "netconf-node-topology:backoff-multiplier": 1.5, "netconf-node-topology:keepalive-delay": 120}}]}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '710', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_04_connect_rdmC(self): 11:21:45 > response = test_utils.mount_device("ROADMC01", ('roadmc-full', self.NODE_VERSION)) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:106: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:343: in mount_device 11:21:45 response = put_request(url[RESTCONF_VERSION].format('{}', node), body) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_04_connect_rdmC 11:21:45 ________ TransportPCEFulltesting.test_05_connect_xpdrA_N1_to_roadmA_PP1 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
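test_04_connect_rdmC fails the same way as test_03: the PUT for ROADMC01 never gets a TCP connection to localhost:8182. Since every failure in this run is a connect-time ECONNREFUSED rather than an application error, a cheap pre-flight probe would surface the root cause before each test burns its own 10-second connect timeout. A minimal, hypothetical sketch (this helper does not exist in transportpce_tests/common/test_utils.py as far as this log shows):

    import socket

    def restconf_reachable(host: str = "localhost", port: int = 8182, timeout: float = 2.0) -> bool:
        # Returns True if something is accepting TCP connections on host:port.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:  # ConnectionRefusedError (errno 111), timeouts, unreachable host
            return False

    if not restconf_reachable():
        raise SystemExit("RESTCONF endpoint localhost:8182 is not listening - is the controller up?")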
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
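Unlike the two mount attempts, test_05_connect_xpdrA_N1_to_roadmA_PP1 fails while invoking a RESTCONF RPC rather than writing configuration: a POST to /rests/operations/transportpce-networkutils:init-xpdr-rdm-links with the links-input body captured above. A minimal sketch of that call, rebuilt from the logged method, URL and body (hypothetical code, same assumed endpoint and credentials as before):

    import requests

    url = ("http://localhost:8182/rests/operations/"
           "transportpce-networkutils:init-xpdr-rdm-links")
    body = {
        "input": {
            "links-input": {
                "xpdr-node": "XPDRA01",
                "xpdr-num": "1",
                "network-num": "1",
                "rdm-node": "ROADMA01",
                "srg-num": "1",
                "termination-point-num": "SRG1-PP1-TXRX",
            }
        }
    }
    resp = requests.post(url, json=body, auth=("admin", "admin"), timeout=(10, 10))
    print(resp.status_code, resp.text)

Like the PUTs before it, this fails at connect time because nothing answers on port 8182.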
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
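The increment() frame above shows why a single refused connection is fatal here: requests' HTTPAdapter default (max_retries=0) arrives as Retry(total=0, connect=None, read=False), so one call to increment() drives total below zero, is_exhausted() becomes true and MaxRetryError is raised. A small sketch of that behaviour (None stands in for the HTTPConnection object normally attached to the error):

    from urllib3.exceptions import MaxRetryError, NewConnectionError
    from urllib3.util.retry import Retry

    retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    try:
        retry.increment(
            method="POST",
            url="/rests/operations/transportpce-networkutils:init-xpdr-rdm-links",
            error=NewConnectionError(None, "Failed to establish a new connection"),
        )
    except MaxRetryError as exc:
        print(exc)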
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_05_connect_xpdrA_N1_to_roadmA_PP1(self): 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'transportpce-networkutils', 'init-xpdr-rdm-links', 11:21:45 {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 11:21:45 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:110: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
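The adapter send() code above converts the tests' (connect, read) timeout tuple into a urllib3 Timeout and maps MaxRetryError back to requests.exceptions.ConnectionError. The failing tests are therefore roughly equivalent to the direct call below (URL, JSON body, the admin/admin credentials decoded from the Basic Authorization header, and the 10-second timeouts are taken from the request dump in this log); with no controller listening on localhost:8182 it raises ConnectionError exactly as pytest reports:

    import requests

    try:
        requests.post(
            "http://localhost:8182/rests/operations/"
            "transportpce-networkutils:init-xpdr-rdm-links",
            json={"input": {"links-input": {
                "xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1",
                "rdm-node": "ROADMA01", "srg-num": "1",
                "termination-point-num": "SRG1-PP1-TXRX"}}},
            auth=("admin", "admin"),
            timeout=(10, 10),  # becomes Timeout(connect=10, read=10) in urllib3
        )
    except requests.exceptions.ConnectionError as exc:
        print(exc)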
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_05_connect_xpdrA_N1_to_roadmA_PP1 11:21:45 ________ TransportPCEFulltesting.test_06_connect_roadmA_PP1_to_xpdrA_N1 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
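At socket level the root cause is simply that nothing accepts connections on localhost:8182, so connect() fails with ECONNREFUSED (errno 111); create_connection() re-raises it and urllib3 wraps it as NewConnectionError further up the stack. A one-line check reproduces it:

    import socket

    try:
        socket.create_connection(("localhost", 8182), timeout=10).close()
    except ConnectionRefusedError as exc:
        print(exc)  # [Errno 111] Connection refused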
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
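The tunnelling branch that follows only matters when a proxy is configured, which is not the case in this job (http_tunnel_required is False in the frames above). A sketch of how such a pool would normally be obtained, with a purely illustrative proxy URL and header: plain-HTTP destinations are sent through the proxy with the merged proxy headers, while HTTPS destinations use the CONNECT tunnel prepared by _prepare_proxy():

    import urllib3

    proxy = urllib3.ProxyManager(
        "http://proxy.example.org:3128/",                # illustrative proxy URL
        proxy_headers={"Proxy-Authorization": "Basic ..."},  # illustrative header
    )
    resp = proxy.request("GET", "http://example.org/")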
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_06_connect_roadmA_PP1_to_xpdrA_N1(self): 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'transportpce-networkutils', 'init-rdm-xpdr-links', 11:21:45 {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 11:21:45 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:119: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_06_connect_roadmA_PP1_to_xpdrA_N1 11:21:45 ________ TransportPCEFulltesting.test_07_connect_xpdrC_N1_to_roadmC_PP1 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
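# --- Illustrative sketch (not part of this CI log) of why a single refused connection is fatal
# --- here: with total=0, the first Retry.increment() call exhausts the budget and raises
# --- MaxRetryError instead of scheduling another attempt. Assumes urllib3 v2.x; the None
# --- connection argument is a stand-in for the real HTTPConnection object.
from urllib3.util.retry import Retry
from urllib3.exceptions import MaxRetryError, NewConnectionError

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    retries.increment(
        method="POST",
        url="/rests/operations/transportpce-networkutils:init-xpdr-rdm-links",
        error=NewConnectionError(None, "Failed to establish a new connection"),
    )
except MaxRetryError as exc:
    print(exc.reason)  # the NewConnectionError above is carried as the MaxRetryError's cause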
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_07_connect_xpdrC_N1_to_roadmC_PP1(self): 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'transportpce-networkutils', 'init-xpdr-rdm-links', 11:21:45 {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '1', 11:21:45 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:128: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_07_connect_xpdrC_N1_to_roadmC_PP1 11:21:45 ________ TransportPCEFulltesting.test_08_connect_roadmC_PP1_to_xpdrC_N1 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
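# --- Illustrative sketch (not part of this CI log): what the failing call in
# --- test_07_connect_xpdrC_N1_to_roadmC_PP1 boils down to once test_utils has built the request.
# --- URL, payload and admin/admin credentials are taken from the frames above; the helper's
# --- exact implementation is not shown in this log, so this reconstruction is an assumption.
import requests

url = "http://localhost:8182/rests/operations/transportpce-networkutils:init-xpdr-rdm-links"
payload = {"input": {"links-input": {
    "xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "1",
    "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}
try:
    requests.post(url, json=payload, auth=("admin", "admin"), timeout=(10, 10))
except requests.exceptions.ConnectionError as exc:
    print(exc)  # the adapter wraps urllib3's MaxRetryError in this requests ConnectionError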
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
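# --- Illustrative note (not part of this CI log): http_tunnel_required is False in every frame
# --- above because no proxy is configured; a CONNECT tunnel is typically only needed when an
# --- https:// destination sits behind an http:// proxy. Minimal sketch, assuming urllib3 v2.x:
from urllib3.util.proxy import connection_requires_http_tunnel

print(connection_requires_http_tunnel(proxy_url=None, proxy_config=None, destination_scheme=None))
# -> False, matching http_tunnel_required = False in the captured locals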
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_08_connect_roadmC_PP1_to_xpdrC_N1(self): 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'transportpce-networkutils', 'init-rdm-xpdr-links', 11:21:45 {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '1', 11:21:45 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:137: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_08_connect_roadmC_PP1_to_xpdrC_N1 11:21:45 _______ TransportPCEFulltesting.test_09_add_omsAttributes_ROADMA_ROADMC ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
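# --- Illustrative sketch (not part of this CI log): the root cause of every failure in this run
# --- sits one level below HTTP: nothing is listening on localhost:8182, so the TCP connect is
# --- refused ([Errno 111]) before any request is sent. A quick probe:
import socket

def controller_listening(host: str = "localhost", port: int = 8182, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError ([Errno 111]) when the RESTCONF endpoint is down
        return False

print(controller_listening())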
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span' 11:21:45 body = '{"span": {"auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{"SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '207', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/i...1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. 
If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
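# --- Illustrative sketch (not part of this CI log): the PUT that
# --- test_09_add_omsAttributes_ROADMA_ROADMC ends up issuing. URL, span payload and admin/admin
# --- credentials come from the frames above; test_utils.add_oms_attr_request itself is not shown
# --- in this log, so this reconstruction is an assumption.
import requests

link = "ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX"
url = ("http://localhost:8182/rests/data/ietf-network:networks/network=openroadm-topology/"
       f"ietf-network-topology:link={link}/org-openroadm-network-topology:OMS-attributes/span")
span = {"span": {"auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12,
                 "engineered-spanloss": 12.2,
                 "link-concatenation": [{"SRLG-Id": 0, "fiber-type": "smf",
                                         "SRLG-length": 100000, "pmd": 0.5}]}}
try:
    requests.put(url, json=span, auth=("admin", "admin"), timeout=(10, 10))
except requests.exceptions.ConnectionError as exc:
    print(exc)  # same refused connection to localhost:8182 as in the two previous tests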
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_09_add_omsAttributes_ROADMA_ROADMC(self): 11:21:45 # Config ROADMA-ROADMC oms-attributes 11:21:45 data = {"span": { 11:21:45 "auto-spanloss": "true", 11:21:45 "spanloss-base": 11.4, 11:21:45 "spanloss-current": 12, 11:21:45 "engineered-spanloss": 12.2, 11:21:45 "link-concatenation": [{ 11:21:45 "SRLG-Id": 0, 11:21:45 "fiber-type": "smf", 11:21:45 "SRLG-length": 100000, 11:21:45 "pmd": 0.5}]}} 11:21:45 > response = test_utils.add_oms_attr_request("ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX", 11:21:45 data) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:157: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:561: in add_oms_attr_request 11:21:45 response = put_request(url2.format('{}', network, link), oms_attr) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = 
OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_09_add_omsAttributes_ROADMA_ROADMC 11:21:45 _______ TransportPCEFulltesting.test_10_add_omsAttributes_ROADMC_ROADMA ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
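# Note (not part of the quoted urllib3 source above): in this run the loop below
# iterates over the socket.getaddrinfo() results for ('localhost', 8182). Nothing is
# listening on that port, so each sock.connect(sa) attempt fails with ECONNREFUSED
# (errno 111); create_connection() re-raises the last error via 'raise err', and
# _new_conn() wraps it in NewConnectionError, which is exactly the chain this
# traceback shows.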
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMC01-DEG2-DEG2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span' 11:21:45 body = '{"span": {"auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{"SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '207', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/i...2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. 
If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'PUT' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMC01-DEG2-DEG2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMC01-DEG2-DEG2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_10_add_omsAttributes_ROADMC_ROADMA(self): 11:21:45 # Config ROADMC-ROADMA oms-attributes 11:21:45 data = {"span": { 11:21:45 "auto-spanloss": "true", 11:21:45 "spanloss-base": 11.4, 11:21:45 "spanloss-current": 12, 11:21:45 "engineered-spanloss": 12.2, 11:21:45 "link-concatenation": [{ 11:21:45 "SRLG-Id": 0, 11:21:45 "fiber-type": "smf", 11:21:45 "SRLG-length": 100000, 11:21:45 "pmd": 0.5}]}} 11:21:45 > response = test_utils.add_oms_attr_request("ROADMC01-DEG2-DEG2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX", 11:21:45 data) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:173: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:561: in add_oms_attr_request 11:21:45 response = put_request(url2.format('{}', network, link), oms_attr) 11:21:45 transportpce_tests/common/test_utils.py:124: in put_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = 
OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=ROADMC01-DEG2-DEG2-TTP-TXRXtoROADMA01-DEG1-DEG1-TTP-TXRX/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_10_add_omsAttributes_ROADMC_ROADMA 11:21:45 _____________ TransportPCEFulltesting.test_11_create_eth_service1 ______________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
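# Note (not part of the quoted urllib3 source above): this is the same
# connection-refused path already seen for test_09 and test_10. All three requests
# target localhost:8182, so the RESTCONF endpoint of the controller under test
# appears not to be reachable at all during this run; the end-to-end test cases fail
# before any TransportPCE logic is exercised.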
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:45 body = '{"input": {"sdnc-request-header": {"request-id": "e3028bae-a90f-4ddd-a83f-cf224eba0e58", "rpc-action": "service-creat...-direction": [{"index": 0}], "optic-type": "gray"}, "due-date": "2016-11-28T00:00:01Z", "operator-contact": "pw1234"}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '784', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/org-openroadm-service:service-create', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
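# A minimal sketch, not taken from the TransportPCE test utilities: how a requests
# Session could opt in to the urllib3 Retry behaviour described in the docstring
# above, instead of the Retry(total=0, connect=None, read=False, ...) object that
# appears in these tracebacks (requests performs no retries by default, so a single
# refused connection surfaces immediately as MaxRetryError and then ConnectionError).
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=5, connect=5, backoff_factor=0.5,
              status_forcelist=(502, 503, 504))
session.mount("http://", HTTPAdapter(max_retries=retry))
# Illustrative call only (placeholder URL path, credentials as seen in the
# Authorization header above):
# session.put("http://localhost:8182/rests/data/...", json={}, auth=("admin", "admin"), timeout=10)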
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_11_create_eth_service1(self): 11:21:45 self.cr_serv_input_data["service-name"] = "service1" 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'org-openroadm-service', 'service-create', 11:21:45 self.cr_serv_input_data) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:181: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_11_create_eth_service1 11:21:45 _______________ TransportPCEFulltesting.test_12_get_eth_service1 _______________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 
11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/org-openroadm-service:service-list/services=service1?content=nonconfig' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/org-openroadm-service:service-list/services=service1', query='content=nonconfig', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 
11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 
11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 
11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 
11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/org-openroadm-service:service-list/services=service1?content=nonconfig' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 
11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/org-openroadm-service:service-list/services=service1?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_12_get_eth_service1(self): 11:21:45 > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:190: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:632: in get_ordm_serv_list_attr_request 11:21:45 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:45 
transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/org-openroadm-service:service-list/services=service1?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_12_get_eth_service1 11:21:45 _______________ TransportPCEFulltesting.test_13_check_xc1_ROADMA _______________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. 
If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768?content=nonconfig' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topolog...:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768', query='content=nonconfig', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = 
False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 
11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 
11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768?content=nonconfig' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_13_check_xc1_ROADMA(self): 11:21:45 > response = test_utils.check_node_attribute_request( 11:21:45 "ROADMA01", "roadm-connections", "SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768") 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:404: in check_node_attribute_request 11:21:45 response = get_request(url[RESTCONF_VERSION].format('{}', node, attribute, attribute_value)) 11:21:45 transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_13_check_xc1_ROADMA 11:21:45 _______________ TransportPCEFulltesting.test_14_check_xc1_ROADMC _______________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
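Every failure in this run reduces to the same root cause: the TCP connect to ('localhost', 8182) is refused, meaning the controller's RESTCONF listener is no longer up by the time these later end-to-end tests execute. A minimal sketch that reproduces the identical [Errno 111] outside the test suite, assuming nothing is listening on port 8182:

import socket

# Assumption: no process is bound to localhost:8182 (controller down),
# which is the situation the tracebacks in this log describe.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(10)
try:
    sock.connect(("localhost", 8182))
except ConnectionRefusedError as exc:
    print(exc)   # on Linux: [Errno 111] Connection refused
finally:
    sock.close()

urllib3's create_connection wraps exactly this kind of low-level failure into the NewConnectionError seen throughout this log.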
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768?content=nonconfig' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topolog...:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768', query='content=nonconfig', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
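The Retry(total=0, connect=None, read=False, redirect=None, status=None) object appearing in these frames matches requests' default adapter policy (an HTTPAdapter constructed without max_retries uses Retry(0, read=False)), so the first refused connection already exhausts the budget and increment() raises MaxRetryError. A small sketch of that behaviour, assuming the same unreachable localhost:8182 endpoint and a hypothetical URL path:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Same policy as in the log: zero total retries, read retries disabled.
session.mount("http://", HTTPAdapter(max_retries=Retry(total=0, read=False)))

try:
    session.get("http://localhost:8182/rests/data/ietf-network:networks", timeout=10)
except requests.exceptions.ConnectionError as exc:
    # urllib3's MaxRetryError gets wrapped into requests' ConnectionError.
    print(type(exc).__name__, exc)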
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768?content=nonconfig' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_14_check_xc1_ROADMC(self): 11:21:45 > response = test_utils.check_node_attribute_request( 11:21:45 "ROADMC01", "roadm-connections", "SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768") 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:214: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:404: in check_node_attribute_request 11:21:45 response = get_request(url[RESTCONF_VERSION].format('{}', node, attribute, attribute_value)) 11:21:45 transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_14_check_xc1_ROADMC 11:21:45 _______________ TransportPCEFulltesting.test_15_check_topo_XPDRA _______________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1', query='content=config', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 
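Each failing test reaches this point through transportpce_tests/common/test_utils.py, which formats a RESTCONF URL and issues a plain GET with admin/admin basic auth (the Authorization: Basic YWRtaW46YWRtaW4= header shown above). A rough, hypothetical equivalent of the request test_15 attempts; the helper name and base URL are assumptions, not the project's actual code:

import requests

RESTCONF_BASE = "http://localhost:8182"   # host and port taken from the log

def get_topology_node(network: str, node: str, content: str = "config") -> requests.Response:
    # Hypothetical stand-in for test_utils.get_ietf_network_node_request().
    url = (f"{RESTCONF_BASE}/rests/data/ietf-network:networks/"
           f"network={network}/node={node}?content={content}")
    return requests.get(
        url,
        auth=("admin", "admin"),          # base64-encodes to YWRtaW46YWRtaW4=
        headers={"Accept": "application/json",
                 "Content-Type": "application/json"},
        timeout=(10, 10),                 # connect/read timeouts seen in the log
    )

# Example call matching test_15_check_topo_XPDRA:
# response = get_topology_node("openroadm-topology", "XPDRA01-XPDR1", "config")

With the controller down, this call raises the same requests.exceptions.ConnectionError chain reported here.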
11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. 
Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 
11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 
11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 
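The Timeout(connect=10, read=10, total=None) shown in every frame is what the adapter code above builds when the caller passes a (connect, read) tuple; a single float sets both values, and an existing Timeout instance is passed through unchanged. A short illustration, with the request call itself only sketched as comments since the endpoint is unreachable in this run:

from urllib3.util import Timeout

# requests converts timeout=(10, 10) into this urllib3 Timeout object.
t = Timeout(connect=10, read=10)
print(t.connect_timeout, t.read_timeout)   # -> 10 10

# Equivalent call styles on the requests side (hypothetical URL):
#   requests.get("http://localhost:8182/rests/...", timeout=(10, 10))
#   requests.get("http://localhost:8182/rests/...", timeout=10)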
11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_15_check_topo_XPDRA(self): 11:21:45 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDRA01-XPDR1', 'config') 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:229: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:45 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 
11:21:45 transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_15_check_topo_XPDRA 11:21:45 ____________ TransportPCEFulltesting.test_16_check_topo_ROADMA_SRG1 ____________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. 
If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1', query='content=config', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> 
BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). 
This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_16_check_topo_ROADMA_SRG1(self): 11:21:45 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-SRG1', 'config') 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:247: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:45 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:45 transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 
11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_16_check_topo_ROADMA_SRG1 11:21:45 ____________ TransportPCEFulltesting.test_17_check_topo_ROADMA_DEG1 ____________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config' 11:21:45 body = None 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1', query='content=config', fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 
11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. 
Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 
11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 
11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'GET' 11:21:45 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 
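The retry policy visible in this frame, Retry(total=0, connect=None, read=False, redirect=None, status=None), leaves no retry budget, so the first refused connection is converted straight into MaxRetryError and then requests.exceptions.ConnectionError. As an illustration only (this is not TransportPCE's configuration), the sketch below shows how a more forgiving urllib3 Retry policy can be mounted on a requests Session; the parameter values are assumptions chosen for the example.

    # Illustrative sketch: attach a urllib3 Retry policy to a requests Session.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(
        total=3,                        # overall retry budget; 0 (as in the log) disables retries
        connect=3,                      # retries for connection errors such as ECONNREFUSED
        backoff_factor=0.5,             # exponential backoff between attempts
        status_forcelist=(502, 503, 504),
    )
    session.mount("http://", HTTPAdapter(max_retries=retry))

    # Requests made through this session retry transient connection failures
    # before raising requests.exceptions.ConnectionError, unlike the run above.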
11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_17_check_topo_ROADMA_DEG1(self): 11:21:45 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-DEG1', 'config') 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:265: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:45 response = 
get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:45 transportpce_tests/common/test_utils.py:116: in get_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_17_check_topo_ROADMA_DEG1 11:21:45 ________ TransportPCEFulltesting.test_18_connect_xpdrA_N2_to_roadmA_PP2 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. 
If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "2", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP2-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None 
= None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 
11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 
11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:45 :param Exception error: An error encountered during the request, or 11:21:45 None if the response was received successfully. 11:21:45 11:21:45 :return: A new ``Retry`` object. 11:21:45 """ 11:21:45 if self.total is False and error: 11:21:45 # Disabled, indicate to re-raise the error. 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 11:21:45 total = self.total 11:21:45 if total is not None: 11:21:45 total -= 1 11:21:45 11:21:45 connect = self.connect 11:21:45 read = self.read 11:21:45 redirect = self.redirect 11:21:45 status_count = self.status 11:21:45 other = self.other 11:21:45 cause = "unknown" 11:21:45 status = None 11:21:45 redirect_location = None 11:21:45 11:21:45 if error and self._is_connection_error(error): 11:21:45 # Connect retry? 11:21:45 if connect is False: 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif connect is not None: 11:21:45 connect -= 1 11:21:45 11:21:45 elif error and self._is_read_error(error): 11:21:45 # Read retry? 11:21:45 if read is False or method is None or not self._is_method_retryable(method): 11:21:45 raise reraise(type(error), error, _stacktrace) 11:21:45 elif read is not None: 11:21:45 read -= 1 11:21:45 11:21:45 elif error: 11:21:45 # Other retry? 11:21:45 if other is not None: 11:21:45 other -= 1 11:21:45 11:21:45 elif response and response.get_redirect_location(): 11:21:45 # Redirect retry? 
11:21:45 if redirect is not None: 11:21:45 redirect -= 1 11:21:45 cause = "too many redirects" 11:21:45 response_redirect_location = response.get_redirect_location() 11:21:45 if response_redirect_location: 11:21:45 redirect_location = response_redirect_location 11:21:45 status = response.status 11:21:45 11:21:45 else: 11:21:45 # Incrementing because of a server error like a 500 in 11:21:45 # status_forcelist and the given method is in the allowed_methods 11:21:45 cause = ResponseError.GENERIC_ERROR 11:21:45 if response and response.status: 11:21:45 if status_count is not None: 11:21:45 status_count -= 1 11:21:45 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:45 status = response.status 11:21:45 11:21:45 history = self.history + ( 11:21:45 RequestHistory(method, url, error, status, redirect_location), 11:21:45 ) 11:21:45 11:21:45 new_retry = self.new( 11:21:45 total=total, 11:21:45 connect=connect, 11:21:45 read=read, 11:21:45 redirect=redirect, 11:21:45 status=status_count, 11:21:45 other=other, 11:21:45 history=history, 11:21:45 ) 11:21:45 11:21:45 if new_retry.is_exhausted(): 11:21:45 reason = error or ResponseError(cause) 11:21:45 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:45 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:45 11:21:45 During handling of the above exception, another exception occurred: 11:21:45 11:21:45 self = 11:21:45 11:21:45 def test_18_connect_xpdrA_N2_to_roadmA_PP2(self): 11:21:45 > response = test_utils.transportpce_api_rpc_request( 11:21:45 'transportpce-networkutils', 'init-xpdr-rdm-links', 11:21:45 {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '2', 11:21:45 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) 11:21:45 11:21:45 transportpce_tests/1.2.1/test06_end2end.py:286: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:45 response = post_request(url, data) 11:21:45 transportpce_tests/common/test_utils.py:142: in post_request 11:21:45 return requests.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:45 return session.request(method=method, url=url, **kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:45 resp = self.send(prep, **send_kwargs) 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:45 r = adapter.send(request, **kwargs) 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 
11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 except (ProtocolError, OSError) as err: 11:21:45 raise ConnectionError(err, request=request) 11:21:45 11:21:45 except MaxRetryError as e: 11:21:45 if isinstance(e.reason, ConnectTimeoutError): 11:21:45 # TODO: Remove this in 3.0.0: see #2811 11:21:45 if not isinstance(e.reason, NewConnectionError): 11:21:45 raise ConnectTimeout(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, ResponseError): 11:21:45 raise RetryError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _ProxyError): 11:21:45 raise ProxyError(e, request=request) 11:21:45 11:21:45 if isinstance(e.reason, _SSLError): 11:21:45 # This branch is for urllib3 v1.22 and later. 
11:21:45 raise SSLError(e, request=request) 11:21:45 11:21:45 > raise ConnectionError(e, request=request) 11:21:45 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:45 ----------------------------- Captured stdout call ----------------------------- 11:21:45 execution of test_18_connect_xpdrA_N2_to_roadmA_PP2 11:21:45 ________ TransportPCEFulltesting.test_19_connect_roadmA_PP2_to_xpdrA_N2 ________ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 > sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:45 raise err 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:45 socket_options = [(6, 1, 1)] 11:21:45 11:21:45 def create_connection( 11:21:45 address: tuple[str, int], 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 source_address: tuple[str, int] | None = None, 11:21:45 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:45 ) -> socket.socket: 11:21:45 """Connect to *address* and return the socket object. 11:21:45 11:21:45 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:45 port)``) and return the socket object. Passing the optional 11:21:45 *timeout* parameter will set the timeout on the socket instance 11:21:45 before attempting to connect. If no *timeout* is supplied, the 11:21:45 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:45 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:45 for the socket to bind as a source address before making the connection. 11:21:45 An host of '' or port 0 tells the OS to use the default. 11:21:45 """ 11:21:45 11:21:45 host, port = address 11:21:45 if host.startswith("["): 11:21:45 host = host.strip("[]") 11:21:45 err = None 11:21:45 11:21:45 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:45 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:45 # The original create_connection function always returns all records. 
11:21:45 family = allowed_gai_family() 11:21:45 11:21:45 try: 11:21:45 host.encode("idna") 11:21:45 except UnicodeError: 11:21:45 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:45 11:21:45 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:45 af, socktype, proto, canonname, sa = res 11:21:45 sock = None 11:21:45 try: 11:21:45 sock = socket.socket(af, socktype, proto) 11:21:45 11:21:45 # If provided, set socket level options before connecting. 11:21:45 _set_socket_options(sock, socket_options) 11:21:45 11:21:45 if timeout is not _DEFAULT_TIMEOUT: 11:21:45 sock.settimeout(timeout) 11:21:45 if source_address: 11:21:45 sock.bind(source_address) 11:21:45 > sock.connect(sa) 11:21:45 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "2", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP2-TXRX"}}}' 11:21:45 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:45 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 redirect = False, assert_same_host = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:45 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:45 decode_content = False, response_kw = {} 11:21:45 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) 11:21:45 destination_scheme = None, conn = None, release_this_conn = True 11:21:45 http_tunnel_required = False, err = None, clean_exit = False 11:21:45 11:21:45 def urlopen( # type: ignore[override] 11:21:45 self, 11:21:45 method: str, 11:21:45 url: str, 11:21:45 body: _TYPE_BODY | None = None, 11:21:45 headers: typing.Mapping[str, str] | None = None, 11:21:45 retries: Retry | bool | int | None = None, 11:21:45 redirect: bool = True, 11:21:45 assert_same_host: bool = True, 11:21:45 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:45 pool_timeout: int | None = None, 11:21:45 release_conn: bool | None = None, 11:21:45 chunked: bool = False, 11:21:45 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:45 preload_content: bool = True, 11:21:45 decode_content: bool = True, 11:21:45 **response_kw: typing.Any, 11:21:45 ) -> BaseHTTPResponse: 11:21:45 """ 11:21:45 Get a connection from the pool and perform an HTTP request. This is the 11:21:45 lowest level call for making a request, so you'll need to specify all 11:21:45 the raw details. 11:21:45 11:21:45 .. note:: 11:21:45 11:21:45 More commonly, it's appropriate to use a convenience method 11:21:45 such as :meth:`request`. 11:21:45 11:21:45 .. 
note:: 11:21:45 11:21:45 `release_conn` will only behave as expected if 11:21:45 `preload_content=False` because we want to make 11:21:45 `preload_content=False` the default behaviour someday soon without 11:21:45 breaking backwards compatibility. 11:21:45 11:21:45 :param method: 11:21:45 HTTP request method (such as GET, POST, PUT, etc.) 11:21:45 11:21:45 :param url: 11:21:45 The URL to perform the request on. 11:21:45 11:21:45 :param body: 11:21:45 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:45 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:45 11:21:45 :param headers: 11:21:45 Dictionary of custom headers to send, such as User-Agent, 11:21:45 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:45 these headers completely replace any pool-specific headers. 11:21:45 11:21:45 :param retries: 11:21:45 Configure the number of retries to allow before raising a 11:21:45 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:45 11:21:45 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:45 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:45 over different types of retries. 11:21:45 Pass an integer number to retry connection errors that many times, 11:21:45 but no other types of errors. Pass zero to never retry. 11:21:45 11:21:45 If ``False``, then retries are disabled and any exception is raised 11:21:45 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:45 the redirect response will be returned. 11:21:45 11:21:45 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:45 11:21:45 :param redirect: 11:21:45 If True, automatically handle redirects (status codes 301, 302, 11:21:45 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:45 will disable redirect, too. 11:21:45 11:21:45 :param assert_same_host: 11:21:45 If ``True``, will make sure that the host of the pool requests is 11:21:45 consistent else will raise HostChangedError. When ``False``, you can 11:21:45 use the pool on an HTTP proxy and request foreign hosts. 11:21:45 11:21:45 :param timeout: 11:21:45 If specified, overrides the default timeout for this one 11:21:45 request. It may be a float (in seconds) or an instance of 11:21:45 :class:`urllib3.util.Timeout`. 11:21:45 11:21:45 :param pool_timeout: 11:21:45 If set and the pool is set to block=True, then this method will 11:21:45 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:45 connection is available within the time period. 11:21:45 11:21:45 :param bool preload_content: 11:21:45 If True, the response's body will be preloaded into memory. 11:21:45 11:21:45 :param bool decode_content: 11:21:45 If True, will attempt to decode the body based on the 11:21:45 'content-encoding' header. 11:21:45 11:21:45 :param release_conn: 11:21:45 If False, then the urlopen call will not release the connection 11:21:45 back into the pool once a response is received (but will release if 11:21:45 you read the entire contents of the response such as when 11:21:45 `preload_content=True`). This is useful if you're not preloading 11:21:45 the response's content immediately. You will need to call 11:21:45 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:45 back into the pool. If None, it takes the value of ``preload_content`` 11:21:45 which defaults to ``True``. 
11:21:45 11:21:45 :param bool chunked: 11:21:45 If True, urllib3 will send the body using chunked transfer 11:21:45 encoding. Otherwise, urllib3 will send the body using the standard 11:21:45 content-length form. Defaults to False. 11:21:45 11:21:45 :param int body_pos: 11:21:45 Position to seek to in file-like body in the event of a retry or 11:21:45 redirect. Typically this won't need to be set because urllib3 will 11:21:45 auto-populate the value when needed. 11:21:45 """ 11:21:45 parsed_url = parse_url(url) 11:21:45 destination_scheme = parsed_url.scheme 11:21:45 11:21:45 if headers is None: 11:21:45 headers = self.headers 11:21:45 11:21:45 if not isinstance(retries, Retry): 11:21:45 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:45 11:21:45 if release_conn is None: 11:21:45 release_conn = preload_content 11:21:45 11:21:45 # Check host 11:21:45 if assert_same_host and not self.is_same_host(url): 11:21:45 raise HostChangedError(self, url, retries) 11:21:45 11:21:45 # Ensure that the URL we're connecting to is properly encoded 11:21:45 if url.startswith("/"): 11:21:45 url = to_str(_encode_target(url)) 11:21:45 else: 11:21:45 url = to_str(parsed_url.url) 11:21:45 11:21:45 conn = None 11:21:45 11:21:45 # Track whether `conn` needs to be released before 11:21:45 # returning/raising/recursing. Update this variable if necessary, and 11:21:45 # leave `release_conn` constant throughout the function. That way, if 11:21:45 # the function recurses, the original value of `release_conn` will be 11:21:45 # passed down into the recursive call, and its value will be respected. 11:21:45 # 11:21:45 # See issue #651 [1] for details. 11:21:45 # 11:21:45 # [1] 11:21:45 release_this_conn = release_conn 11:21:45 11:21:45 http_tunnel_required = connection_requires_http_tunnel( 11:21:45 self.proxy, self.proxy_config, destination_scheme 11:21:45 ) 11:21:45 11:21:45 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:45 # have to copy the headers dict so we can safely change it without those 11:21:45 # changes being reflected in anyone else's copy. 11:21:45 if not http_tunnel_required: 11:21:45 headers = headers.copy() # type: ignore[attr-defined] 11:21:45 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:45 11:21:45 # Must keep the exception bound to a separate variable or else Python 3 11:21:45 # complains about UnboundLocalError. 11:21:45 err = None 11:21:45 11:21:45 # Keep track of whether we cleanly exited the except block. This 11:21:45 # ensures we do proper cleanup in finally. 11:21:45 clean_exit = False 11:21:45 11:21:45 # Rewind body position, if needed. Record current position 11:21:45 # for future rewinds in the event of a redirect/retry. 11:21:45 body_pos = set_file_position(body, body_pos) 11:21:45 11:21:45 try: 11:21:45 # Request a connection from the queue. 11:21:45 timeout_obj = self._get_timeout(timeout) 11:21:45 conn = self._get_conn(timeout=pool_timeout) 11:21:45 11:21:45 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:45 11:21:45 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:45 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:45 try: 11:21:45 self._prepare_proxy(conn) 11:21:45 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:45 self._raise_timeout( 11:21:45 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:45 ) 11:21:45 raise 11:21:45 11:21:45 # If we're going to release the connection in ``finally:``, then 11:21:45 # the response doesn't need to know about the connection. Otherwise 11:21:45 # it will also try to release it and we'll have a double-release 11:21:45 # mess. 11:21:45 response_conn = conn if not release_conn else None 11:21:45 11:21:45 # Make the request on the HTTPConnection object 11:21:45 > response = self._make_request( 11:21:45 conn, 11:21:45 method, 11:21:45 url, 11:21:45 timeout=timeout_obj, 11:21:45 body=body, 11:21:45 headers=headers, 11:21:45 chunked=chunked, 11:21:45 retries=retries, 11:21:45 response_conn=response_conn, 11:21:45 preload_content=preload_content, 11:21:45 decode_content=decode_content, 11:21:45 **response_kw, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:45 conn.request( 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:45 self.endheaders() 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:45 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:45 self.send(msg) 11:21:45 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:45 self.connect() 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:45 self.sock = self._new_conn() 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = 11:21:45 11:21:45 def _new_conn(self) -> socket.socket: 11:21:45 """Establish a socket connection and set nodelay settings on it. 11:21:45 11:21:45 :return: New socket connection. 11:21:45 """ 11:21:45 try: 11:21:45 sock = connection.create_connection( 11:21:45 (self._dns_host, self.port), 11:21:45 self.timeout, 11:21:45 source_address=self.source_address, 11:21:45 socket_options=self.socket_options, 11:21:45 ) 11:21:45 except socket.gaierror as e: 11:21:45 raise NameResolutionError(self.host, self, e) from e 11:21:45 except SocketTimeout as e: 11:21:45 raise ConnectTimeoutError( 11:21:45 self, 11:21:45 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:45 ) from e 11:21:45 11:21:45 except OSError as e: 11:21:45 > raise NewConnectionError( 11:21:45 self, f"Failed to establish a new connection: {e}" 11:21:45 ) from e 11:21:45 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:45 11:21:45 The above exception was the direct cause of the following exception: 11:21:45 11:21:45 self = 11:21:45 request = , stream = False 11:21:45 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:45 proxies = OrderedDict() 11:21:45 11:21:45 def send( 11:21:45 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:45 ): 11:21:45 """Sends PreparedRequest object. Returns Response object. 11:21:45 11:21:45 :param request: The :class:`PreparedRequest ` being sent. 11:21:45 :param stream: (optional) Whether to stream the request content. 11:21:45 :param timeout: (optional) How long to wait for the server to send 11:21:45 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:45 read timeout) ` tuple. 11:21:45 :type timeout: float or tuple or urllib3 Timeout object 11:21:45 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:45 we verify the server's TLS certificate, or a string, in which case it 11:21:45 must be a path to a CA bundle to use 11:21:45 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:45 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:45 :rtype: requests.Response 11:21:45 """ 11:21:45 11:21:45 try: 11:21:45 conn = self.get_connection_with_tls_context( 11:21:45 request, verify, proxies=proxies, cert=cert 11:21:45 ) 11:21:45 except LocationValueError as e: 11:21:45 raise InvalidURL(e, request=request) 11:21:45 11:21:45 self.cert_verify(conn, request.url, verify, cert) 11:21:45 url = self.request_url(request, proxies) 11:21:45 self.add_headers( 11:21:45 request, 11:21:45 stream=stream, 11:21:45 timeout=timeout, 11:21:45 verify=verify, 11:21:45 cert=cert, 11:21:45 proxies=proxies, 11:21:45 ) 11:21:45 11:21:45 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:45 11:21:45 if isinstance(timeout, tuple): 11:21:45 try: 11:21:45 connect, read = timeout 11:21:45 timeout = TimeoutSauce(connect=connect, read=read) 11:21:45 except ValueError: 11:21:45 raise ValueError( 11:21:45 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:45 f"or a single float to set both timeouts to the same value." 
11:21:45 ) 11:21:45 elif isinstance(timeout, TimeoutSauce): 11:21:45 pass 11:21:45 else: 11:21:45 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:45 11:21:45 try: 11:21:45 > resp = conn.urlopen( 11:21:45 method=request.method, 11:21:45 url=url, 11:21:45 body=request.body, 11:21:45 headers=request.headers, 11:21:45 redirect=False, 11:21:45 assert_same_host=False, 11:21:45 preload_content=False, 11:21:45 decode_content=False, 11:21:45 retries=self.max_retries, 11:21:45 timeout=timeout, 11:21:45 chunked=chunked, 11:21:45 ) 11:21:45 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:45 retries = retries.increment( 11:21:45 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:45 11:21:45 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:45 method = 'POST' 11:21:45 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:45 response = None 11:21:45 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:45 _pool = 11:21:45 _stacktrace = 11:21:45 11:21:45 def increment( 11:21:45 self, 11:21:45 method: str | None = None, 11:21:45 url: str | None = None, 11:21:45 response: BaseHTTPResponse | None = None, 11:21:45 error: Exception | None = None, 11:21:45 _pool: ConnectionPool | None = None, 11:21:45 _stacktrace: TracebackType | None = None, 11:21:45 ) -> Self: 11:21:45 """Return a new Retry object with incremented retry counters. 11:21:45 11:21:45 :param response: A response object, or None, if the server did not 11:21:45 return a response. 11:21:45 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_19_connect_roadmA_PP2_to_xpdrA_N2(self): 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'transportpce-networkutils', 'init-rdm-xpdr-links', 11:21:46 {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '2', 11:21:46 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:295: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:46 response = post_request(url, data) 11:21:46 transportpce_tests/common/test_utils.py:142: in post_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 
11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_19_connect_roadmA_PP2_to_xpdrA_N2 11:21:46 ________ TransportPCEFulltesting.test_20_connect_xpdrC_N2_to_roadmC_PP2 ________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
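The create_connection() helper quoted above simply walks the getaddrinfo() results for ('localhost', 8182) and calls sock.connect(); with no listener on that port the OS answers with ECONNREFUSED, hence the [Errno 111] seen in every test of this batch. A hypothetical pre-flight check (not part of the test suite) that would confirm whether the controller's RESTCONF port is reachable before the RPC tests run:

    import socket

    # Hypothetical helper, assuming the host/port used by these tests (localhost:8182).
    try:
        with socket.create_connection(("localhost", 8182), timeout=10):
            print("RESTCONF endpoint reachable")
    except ConnectionRefusedError as exc:  # [Errno 111] when nothing listens on the port
        print(f"controller not listening on 8182: {exc}")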
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'POST' 11:21:46 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:46 body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "2", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP2-TXRX"}}}' 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. 
note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 
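The retries value captured in these frames, Retry(total=0, connect=None, read=False, redirect=None, status=None), allows no retry at all, so a single refused connect is immediately fatal. Per the docstring above, passing a Retry object as max_retries gives fine-grained control; a sketch of what enabling connection retries with backoff could look like (illustrative only, the test suite does not do this):

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    # Retry refused/failed connects a few times with exponential backoff;
    # allowed_methods=None retries regardless of HTTP verb (these RPCs are POSTs).
    retry = Retry(total=3, connect=3, read=0, backoff_factor=0.5, allowed_methods=None)
    session.mount("http://", HTTPAdapter(max_retries=retry))
    # session.post(url, json=payload, ...) would now retry before surfacing ConnectionError.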
11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'POST' 11:21:46 url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
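With total=0, the decrement performed by increment() leaves the successor Retry exhausted, which is why the very first connection failure is rethrown as MaxRetryError and then rewrapped by requests as ConnectionError. A minimal sketch of that arithmetic using the public Retry API:

    from urllib3.util.retry import Retry

    r = Retry(total=0, read=False)   # the configuration captured in these frames
    # increment() builds its successor via new() with the decremented counter:
    r2 = r.new(total=r.total - 1)
    assert r2.is_exhausted()         # True, so MaxRetryError is raised from the NewConnectionError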
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_20_connect_xpdrC_N2_to_roadmC_PP2(self): 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'transportpce-networkutils', 'init-xpdr-rdm-links', 11:21:46 {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '2', 11:21:46 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:304: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:46 response = post_request(url, data) 11:21:46 transportpce_tests/common/test_utils.py:142: in post_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 
11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_20_connect_xpdrC_N2_to_roadmC_PP2 11:21:46 ________ TransportPCEFulltesting.test_21_connect_roadmC_PP2_to_xpdrC_N2 ________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'POST' 11:21:46 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:46 body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "2", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP2-TXRX"}}}' 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. 
note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 
11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'POST' 11:21:46 url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_21_connect_roadmC_PP2_to_xpdrC_N2(self): 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'transportpce-networkutils', 'init-rdm-xpdr-links', 11:21:46 {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '2', 11:21:46 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:313: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:46 response = post_request(url, data) 11:21:46 transportpce_tests/common/test_utils.py:142: in post_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 
11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_21_connect_roadmC_PP2_to_xpdrC_N2 11:21:46 _____________ TransportPCEFulltesting.test_22_create_eth_service2 ______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:46 body = '{"input": {"sdnc-request-header": {"request-id": "e3028bae-a90f-4ddd-a83f-cf224eba0e58", "rpc-action": "service-creat...-direction": [{"index": 0}], "optic-type": "gray"}, "due-date": "2016-11-28T00:00:01Z", "operator-contact": "pw1234"}}' 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '784', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/org-openroadm-service:service-create', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. 
note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 
11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 
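The urlopen() body excerpted above normalises whatever requests passes as retries through Retry.from_int(); requests' HTTPAdapter hands it max_retries, which defaults to a zero-retry budget. If the suite wanted to tolerate a controller that is still starting up, it could mount an adapter with a real Retry policy; a hedged sketch (the Session wiring below is illustrative only and is not how test_utils is written):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry refused connections a few times with exponential backoff instead of
# failing on the first attempt; NewConnectionError counts as a connect error.
retry = Retry(total=5, connect=5, backoff_factor=0.5)
session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=retry))
# Requests issued through this session would now be retried on ECONNREFUSED
# before requests.exceptions.ConnectionError is finally raised.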
11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_22_create_eth_service2(self): 11:21:46 self.cr_serv_input_data["service-name"] = "service2" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-create', 11:21:46 self.cr_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:323: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:46 response = post_request(url, data) 11:21:46 transportpce_tests/common/test_utils.py:142: in post_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 
11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_22_create_eth_service2 11:21:46 _______________ TransportPCEFulltesting.test_23_get_eth_service2 _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 
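The socket_options = [(6, 1, 1)] value repeated in these frames is urllib3's default TCP_NODELAY option: on Linux, level 6 is IPPROTO_TCP and option 1 is TCP_NODELAY. A short sketch of what _set_socket_options() amounts to for that tuple:

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# (6, 1, 1) == (IPPROTO_TCP, TCP_NODELAY, 1): disable Nagle's algorithm so the
# small JSON requests in this suite are not delayed waiting for coalescing.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
sock.close()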
11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/org-openroadm-service:service-list/services=service2?content=nonconfig' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/org-openroadm-service:service-list/services=service2', query='content=nonconfig', fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 
11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 
11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 
11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 
11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/org-openroadm-service:service-list/services=service2?content=nonconfig' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 
11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/org-openroadm-service:service-list/services=service2?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_23_get_eth_service2(self): 11:21:46 > response = test_utils.get_ordm_serv_list_attr_request("services", "service2") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:332: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:632: in get_ordm_serv_list_attr_request 11:21:46 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:46 
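Retry.increment(), excerpted above, is why there is no second attempt: requests' adapters default to max_retries=Retry(0, read=False), the Retry(total=0, connect=None, read=False, ...) object visible in every frame, so a single connection error exhausts the budget, MaxRetryError is raised immediately, and the adapter wraps it in requests.exceptions.ConnectionError. A small sketch of that accounting (the OSError below merely stands in for the NewConnectionError from the log):

from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

retry = Retry(total=0, read=False)  # the budget requests uses by default
try:
    # increment() normally returns a new Retry with one attempt consumed; with
    # total=0 the result is already exhausted, so MaxRetryError is raised instead.
    retry = retry.increment(method="GET", url="/example",
                            error=OSError("[Errno 111] Connection refused"))
except MaxRetryError as exc:
    print("retries exhausted:", exc.reason)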
transportpce_tests/common/test_utils.py:116: in get_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/org-openroadm-service:service-list/services=service2?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_23_get_eth_service2 11:21:46 _______________ TransportPCEFulltesting.test_24_check_xc2_ROADMA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. 
If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760?content=nonconfig' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topolog...:org-openroadm-device/roadm-connections=DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760', query='content=nonconfig', fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = 
False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 
11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 
11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760?content=nonconfig' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_24_check_xc2_ROADMA(self): 11:21:46 > response = test_utils.check_node_attribute_request( 11:21:46 "ROADMA01", "roadm-connections", "DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:341: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:404: in check_node_attribute_request 11:21:46 response = get_request(url[RESTCONF_VERSION].format('{}', node, attribute, attribute_value)) 11:21:46 transportpce_tests/common/test_utils.py:116: in get_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 
11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=DEG1-TTP-TXRX-SRG1-PP2-TXRX-753:760?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_24_check_xc2_ROADMA 11:21:46 _______________ TransportPCEFulltesting.test_25_check_topo_XPDRA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1', query='content=config', fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 
11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. 
Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 
11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 
11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 
11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_25_check_topo_XPDRA(self): 11:21:46 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDRA01-XPDR1', 'config') 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:356: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:46 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 
11:21:46 transportpce_tests/common/test_utils.py:116: in get_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPDRA01-XPDR1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_25_check_topo_XPDRA 11:21:46 ____________ TransportPCEFulltesting.test_26_check_topo_ROADMA_SRG1 ____________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. 
If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1', query='content=config', fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> 
BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). 
This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 
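# --- editor's sketch (not part of the captured traceback): at this point requests has already
# --- translated its per-request settings into the urllib3 objects shown in the frame context
# --- above; equivalent Timeout and Retry values can be built directly, assuming the urllib3
# --- release pinned in this tox environment.
from urllib3.util import Retry, Timeout

timeout_obj = Timeout(connect=10, read=10)           # renders as Timeout(connect=10, read=10, total=None)
retries = Retry(total=0, connect=None, read=False)   # the policy requests builds for max_retries=0
print(timeout_obj.connect_timeout, retries.total)    # -> 10 0
# --- end of editor's sketch ---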
11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
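# --- editor's sketch (not part of the captured traceback): with the Retry(total=0, ...) policy
# --- shown above, a single refused connection drives ``total`` below zero, so increment() ends
# --- in the MaxRetryError raised in the next frame; a minimal reproduction, assuming the
# --- urllib3 2.x release pinned in this tox environment:
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util import Retry

retry = Retry(total=0, connect=None, read=False)
try:
    retry.increment(
        method="GET",
        url="/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config",
        error=NewConnectionError(None, "Connection refused"),
    )
except MaxRetryError as exc:
    print("retries exhausted:", exc.reason)
# --- end of editor's sketch ---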
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_26_check_topo_ROADMA_SRG1(self): 11:21:46 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-SRG1', 'config') 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:379: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:46 response = get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:46 transportpce_tests/common/test_utils.py:116: in get_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 
11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
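# --- editor's sketch (not part of the captured traceback): one adapter frame later, the
# --- MaxRetryError handled here is re-raised as requests.exceptions.ConnectionError, which is
# --- what the test helper ultimately sees; a hypothetical stand-alone reproduction using the
# --- URL, credentials and timeout shown in this traceback:
import requests

URL = ("http://localhost:8182/rests/data/ietf-network:networks/"
       "network=openroadm-topology/node=ROADMA01-SRG1?content=config")
try:
    response = requests.get(URL, auth=("admin", "admin"), timeout=10)
    print(response.status_code)
except requests.exceptions.ConnectionError as exc:
    print("controller unreachable:", exc)
# --- end of editor's sketch ---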
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_26_check_topo_ROADMA_SRG1 11:21:46 ____________ TransportPCEFulltesting.test_27_check_topo_ROADMA_DEG1 ____________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
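# --- editor's sketch (not part of the captured traceback): the failures shown here are all the
# --- same [Errno 111] against localhost:8182, which typically means nothing is listening on the
# --- RESTCONF port the tests target; the attempt can be reproduced at socket level with the
# --- values from the frame above (socket_options=[(6, 1, 1)] is (IPPROTO_TCP, TCP_NODELAY, 1)).
import socket

try:
    with socket.create_connection(("localhost", 8182), timeout=10) as sock:
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        print("RESTCONF port is reachable")
except ConnectionRefusedError as exc:
    print("connection refused:", exc)
# --- end of editor's sketch ---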
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1', query='content=config', fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 
11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. 
Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 
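# --- editor's sketch (not part of the captured traceback): the release_conn/preload_content
# --- behaviour documented in the docstring above, shown with a plain PoolManager; the URL is an
# --- arbitrary placeholder and the snippet assumes outbound network access.
import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "http://example.com/", preload_content=False)
try:
    body = resp.read()            # stream the body instead of preloading it into memory
finally:
    resp.release_conn()           # explicitly hand the connection back to the pool
print(resp.status, len(body))
# --- end of editor's sketch ---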
11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 
11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'GET' 11:21:46 url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 
11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_27_check_topo_ROADMA_DEG1(self): 11:21:46 > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-DEG1', 'config') 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:405: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:583: in get_ietf_network_node_request 11:21:46 response = 
get_request(url[RESTCONF_VERSION].format(*format_args)) 11:21:46 transportpce_tests/common/test_utils.py:116: in get_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-DEG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_27_check_topo_ROADMA_DEG1 11:21:46 _____________ TransportPCEFulltesting.test_28_create_eth_service3 ______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. 
If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:46 body = '{"input": {"sdnc-request-header": {"request-id": "e3028bae-a90f-4ddd-a83f-cf224eba0e58", "rpc-action": "service-creat...-direction": [{"index": 0}], "optic-type": "gray"}, "due-date": "2016-11-28T00:00:01Z", "operator-contact": "pw1234"}}' 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '784', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/org-openroadm-service:service-create', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 
11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 
11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 
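# --- editor's sketch (not part of the captured traceback): the same low-level path can be driven
# --- with an explicit HTTPConnectionPool; host, port, target and retry policy mirror the
# --- service-create frame above, the JSON body is a placeholder rather than the real test
# --- payload, and the refused connection surfaces here as MaxRetryError before requests
# --- re-wraps it.
from urllib3 import HTTPConnectionPool
from urllib3.exceptions import MaxRetryError
from urllib3.util import Retry, Timeout

pool = HTTPConnectionPool("localhost", port=8182)
try:
    resp = pool.urlopen(
        "POST",
        "/rests/operations/org-openroadm-service:service-create",
        body='{"input": {}}',                              # placeholder, not the test payload
        headers={"Content-Type": "application/json"},
        retries=Retry(total=0, connect=None, read=False),
        timeout=Timeout(connect=10, read=10),
    )
    print(resp.status)
except MaxRetryError as exc:
    print("max retries exceeded:", exc.reason)
# --- end of editor's sketch ---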
11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'POST', url = '/rests/operations/org-openroadm-service:service-create' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
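The ``Retry(total=0, connect=None, read=False, redirect=None, status=None)`` instance shown above is exhausted by a single failure, which is why ``increment()`` ends in ``MaxRetryError`` a few frames below. A small standalone illustration of that behaviour (hypothetical values, not taken from the test run):

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0)  # roughly what requests' default max_retries amounts to
err = NewConnectionError(None, "Failed to establish a new connection")
try:
    # total drops below zero, the policy is exhausted and MaxRetryError is raised
    retry.increment(method="POST", url="/rests/operations/...", error=err)
except MaxRetryError as exc:
    print(exc.reason)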
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_28_create_eth_service3(self): 11:21:46 self.cr_serv_input_data["service-name"] = "service3" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-create', 11:21:46 self.cr_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:431: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:687: in transportpce_api_rpc_request 11:21:46 response = post_request(url, data) 11:21:46 transportpce_tests/common/test_utils.py:142: in post_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 
11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
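Since the adapter converts the ``MaxRetryError`` above into ``requests.exceptions.ConnectionError`` (the re-raise just below), a caller that wants to wait for the RESTCONF endpoint to come back has to catch that exception and poll. A hypothetical helper along those lines; ``post_when_reachable`` is not an existing function in transportpce_tests/common/test_utils.py:

import time
import requests

def post_when_reachable(url, payload, attempts=10, delay=2, timeout=10):
    """Retry a POST while the endpoint still refuses connections (sketch only)."""
    for attempt in range(1, attempts + 1):
        try:
            return requests.post(url, json=payload, timeout=timeout)
        except requests.exceptions.ConnectionError:
            if attempt == attempts:
                raise
            time.sleep(delay)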
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/org-openroadm-service:service-create (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_28_create_eth_service3 11:21:46 _____________ TransportPCEFulltesting.test_29_delete_eth_service3 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 
11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_29_delete_eth_service3(self): 11:21:46 self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service3" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-delete', 11:21:46 self.del_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:445: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_29_delete_eth_service3 11:21:46 _____________ TransportPCEFulltesting.test_30_delete_eth_service1 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 
11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_30_delete_eth_service1(self): 11:21:46 self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-delete', 11:21:46 self.del_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:456: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. 
JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_30_delete_eth_service1 11:21:46 _____________ TransportPCEFulltesting.test_31_delete_eth_service2 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_31_delete_eth_service2(self): 11:21:46 self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service2" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-delete', 11:21:46 self.del_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:466: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_31_delete_eth_service2 11:21:46 ______________ TransportPCEFulltesting.test_32_check_no_xc_ROADMA ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 
11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_32_check_no_xc_ROADMA(self): 11:21:46 > response = test_utils.check_node_request("ROADMA01") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:475: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:389: in check_node_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_32_check_no_xc_ROADMA 11:21:46 _______________ TransportPCEFulltesting.test_33_check_topo_XPDRA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_33_check_topo_XPDRA(self): 11:21:46 response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDRA01-XPDR1', 'config') 11:21:46 > self.assertEqual(response['status_code'], requests.codes.ok) 11:21:46 E AssertionError: 404 != 200 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:483: AssertionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_33_check_topo_XPDRA 11:21:46 ____________ TransportPCEFulltesting.test_34_check_topo_ROADMA_SRG1 ____________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_34_check_topo_ROADMA_SRG1(self): 11:21:46 response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-SRG1', 'config') 11:21:46 > self.assertEqual(response['status_code'], requests.codes.ok) 11:21:46 E AssertionError: 404 != 200 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:502: AssertionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_34_check_topo_ROADMA_SRG1 11:21:46 ____________ TransportPCEFulltesting.test_35_check_topo_ROADMA_DEG1 ____________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_35_check_topo_ROADMA_DEG1(self): 11:21:46 response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-DEG1', 'config') 11:21:46 > self.assertEqual(response['status_code'], requests.codes.ok) 11:21:46 E AssertionError: 404 != 200 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:528: AssertionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_35_check_topo_ROADMA_DEG1 11:21:46 ______________ TransportPCEFulltesting.test_36_create_oc_service1 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 
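Every ``JSONDecodeError: Expecting value: line 1 column 1 (char 0)`` in this run follows the same pattern: the HTTP response body is empty (``s = ''`` in the raw_decode frame) and ``response.json()`` is called unconditionally in test_utils. A defensive variant of that call, shown purely as a sketch; ``parse_rpc_response`` is a hypothetical name, not an existing helper:

import requests

def parse_rpc_response(response: requests.Response) -> dict:
    """Decode the JSON body if there is one; otherwise return a stub (sketch)."""
    if not response.text.strip():
        return {"status_code": response.status_code, "output": None}
    try:
        return {"status_code": response.status_code, **response.json()}
    except requests.exceptions.JSONDecodeError:
        return {"status_code": response.status_code, "raw": response.text}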
11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_36_create_oc_service1(self): 11:21:46 self.cr_serv_input_data["service-name"] = "service1" 11:21:46 self.cr_serv_input_data["connection-type"] = "roadm-line" 11:21:46 self.cr_serv_input_data["service-a-end"]["node-id"] = "ROADMA01" 11:21:46 self.cr_serv_input_data["service-a-end"]["service-format"] = "OC" 11:21:46 self.cr_serv_input_data["service-z-end"]["node-id"] = "ROADMC01" 11:21:46 self.cr_serv_input_data["service-z-end"]["service-format"] = "OC" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-create', 11:21:46 self.cr_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:559: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. 
JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_36_create_oc_service1 11:21:46 _______________ TransportPCEFulltesting.test_37_get_oc_service1 ________________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_37_get_oc_service1(self): 11:21:46 > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:568: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:633: in get_ordm_serv_list_attr_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_37_get_oc_service1 11:21:46 _______________ TransportPCEFulltesting.test_38_check_xc1_ROADMA _______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 
11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_38_check_xc1_ROADMA(self): 11:21:46 > response = test_utils.check_node_attribute_request( 11:21:46 "ROADMA01", "roadm-connections", "SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:577: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:405: in check_node_attribute_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_38_check_xc1_ROADMA 11:21:46 _______________ TransportPCEFulltesting.test_39_check_xc1_ROADMC _______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. 
JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_39_check_xc1_ROADMC(self): 11:21:46 > response = test_utils.check_node_attribute_request( 11:21:46 "ROADMC01", "roadm-connections", "SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:593: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:405: in check_node_attribute_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. 
Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_39_check_xc1_ROADMC 11:21:46 ______________ TransportPCEFulltesting.test_40_create_oc_service2 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
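Every failure from test_38 onwards above follows the same pattern: the RESTCONF reply carries an empty body, so the helpers in transportpce_tests/common/test_utils.py hit requests.exceptions.JSONDecodeError ("Expecting value: line 1 column 1") instead of reporting the underlying HTTP error. A minimal sketch of a defensive decode, assuming only a plain requests.Response (safe_json is an illustrative name, not a helper that exists in test_utils):

import requests

def safe_json(response: requests.Response) -> dict:
    # Illustrative guard only: decode the body when there is one, otherwise
    # surface the status code instead of letting JSONDecodeError mask the
    # real failure reported by the controller.
    if not response.text.strip():
        return {"status_code": response.status_code, "body": None}
    try:
        return {"status_code": response.status_code, "body": response.json()}
    except requests.exceptions.JSONDecodeError:
        return {"status_code": response.status_code, "body": response.text}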
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_40_create_oc_service2(self): 11:21:46 self.cr_serv_input_data["service-name"] = "service2" 11:21:46 self.cr_serv_input_data["connection-type"] = "roadm-line" 11:21:46 self.cr_serv_input_data["service-a-end"]["node-id"] = "ROADMA01" 11:21:46 self.cr_serv_input_data["service-a-end"]["service-format"] = "OC" 11:21:46 self.cr_serv_input_data["service-z-end"]["node-id"] = "ROADMC01" 11:21:46 self.cr_serv_input_data["service-z-end"]["service-format"] = "OC" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-create', 11:21:46 self.cr_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:615: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 
11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_40_create_oc_service2 11:21:46 _______________ TransportPCEFulltesting.test_41_get_oc_service2 ________________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 
11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_41_get_oc_service2(self): 11:21:46 > response = test_utils.get_ordm_serv_list_attr_request("services", "service2") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:624: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:633: in get_ordm_serv_list_attr_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_41_get_oc_service2 11:21:46 _______________ TransportPCEFulltesting.test_42_check_xc2_ROADMA _______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 
11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_42_check_xc2_ROADMA(self): 11:21:46 > response = test_utils.check_node_attribute_request( 11:21:46 "ROADMA01", "roadm-connections", "SRG1-PP2-TXRX-DEG1-TTP-TXRX-753:760") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:633: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:405: in check_node_attribute_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. 
Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_42_check_xc2_ROADMA 11:21:46 ______________ TransportPCEFulltesting.test_43_check_topo_ROADMA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_43_check_topo_ROADMA(self): 11:21:46 > self.test_26_check_topo_ROADMA_SRG1() 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:649: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:380: in test_26_check_topo_ROADMA_SRG1 11:21:46 self.assertEqual(response['status_code'], requests.codes.ok) 11:21:46 E AssertionError: 404 != 200 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_43_check_topo_ROADMA 11:21:46 ______________ TransportPCEFulltesting.test_44_delete_oc_service1 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_44_delete_oc_service1(self): 11:21:46 self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-delete', 11:21:46 self.del_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:655: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
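The guess_json_utf call quoted in each of these tracebacks is requests' RFC 4627 heuristic for picking a UTF codec when the server declares no charset; it never comes into play here, because an empty body fails the `if not self.encoding and self.content ...` guard and the code falls straight through to complexjson.loads(self.text). A standalone illustration of what it does (not part of the test suite):

from requests.utils import guess_json_utf

# BOM / NULL-byte inspection of the raw payload, per the logic quoted above.
payload = '{"node-id": "ROADMA01"}'.encode("utf-16")
print(guess_json_utf(payload))                      # 'utf-16'
print(guess_json_utf(b'{"node-id": "ROADMA01"}'))   # 'utf-8'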
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_44_delete_oc_service1 11:21:46 ______________ TransportPCEFulltesting.test_45_delete_oc_service2 ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 
11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_45_delete_oc_service2(self): 11:21:46 self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service2" 11:21:46 > response = test_utils.transportpce_api_rpc_request( 11:21:46 'org-openroadm-service', 'service-delete', 11:21:46 self.del_serv_input_data) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:665: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_45_delete_oc_service2 11:21:46 ______________ TransportPCEFulltesting.test_46_get_no_oc_services ______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 
11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_46_get_no_oc_services(self): 11:21:46 > response = test_utils.get_ordm_serv_list_request() 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:674: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:617: in get_ordm_serv_list_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. 
Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_46_get_no_oc_services 11:21:46 _______________ TransportPCEFulltesting.test_47_get_no_xc_ROADMA _______________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_47_get_no_xc_ROADMA(self): 11:21:46 > response = test_utils.check_node_request("ROADMA01") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:693: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:389: in check_node_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_47_get_no_xc_ROADMA 11:21:46 ______________ TransportPCEFulltesting.test_48_check_topo_ROADMA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_48_check_topo_ROADMA(self): 11:21:46 > self.test_34_check_topo_ROADMA_SRG1() 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:699: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:502: in test_34_check_topo_ROADMA_SRG1 11:21:46 self.assertEqual(response['status_code'], requests.codes.ok) 11:21:46 E AssertionError: 404 != 200 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_48_check_topo_ROADMA 11:21:46 ___________ TransportPCEFulltesting.test_49_loop_create_eth_service ____________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_49_loop_create_eth_service(self): 11:21:46 # pylint: disable=consider-using-f-string 11:21:46 for i in range(1, 4): 11:21:46 print("iteration number {}".format(i)) 11:21:46 print("eth service creation") 11:21:46 > self.test_11_create_eth_service1() 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:707: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:181: in test_11_create_eth_service1 11:21:46 response = test_utils.transportpce_api_rpc_request( 11:21:46 transportpce_tests/common/test_utils.py:691: in transportpce_api_rpc_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 
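raw_decode, quoted repeatedly above, is where the "Expecting value" error actually originates: scan_once finds nothing to parse in an empty string. On valid input it returns the decoded object together with the index where the document ends, which is what lets trailing data be tolerated. A short standard-library illustration:

import json

decoder = json.JSONDecoder()
obj, end = decoder.raw_decode('{"status": "ok"} trailing text')
print(obj, end)  # {'status': 'ok'} 16
# decoder.raw_decode('') raises json.JSONDecodeError:
#   Expecting value: line 1 column 1 (char 0)  -- exactly the error in these logs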
11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_49_loop_create_eth_service 11:21:46 iteration number 1 11:21:46 eth service creation 11:21:46 ____________ TransportPCEFulltesting.test_50_loop_create_oc_service ____________ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 > return complexjson.loads(self.text, **kwargs) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:974: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:346: in loads 11:21:46 return _default_decoder.decode(s) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:337: in decode 11:21:46 obj, end = self.raw_decode(s, idx=_w(s, 0).end()) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , s = '', idx = 0 11:21:46 11:21:46 def raw_decode(self, s, idx=0): 11:21:46 """Decode a JSON document from ``s`` (a ``str`` beginning with 11:21:46 a JSON document) and return a 2-tuple of the Python 11:21:46 representation and the index in ``s`` where the document ended. 11:21:46 11:21:46 This can be used to decode a JSON document from a string that may 11:21:46 have extraneous data at the end. 
11:21:46 11:21:46 """ 11:21:46 try: 11:21:46 obj, end = self.scan_once(s, idx) 11:21:46 except StopIteration as err: 11:21:46 > raise JSONDecodeError("Expecting value", s, err.value) from None 11:21:46 E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/json/decoder.py:355: JSONDecodeError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_50_loop_create_oc_service(self): 11:21:46 > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:716: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:633: in get_ordm_serv_list_attr_request 11:21:46 res = response.json() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = , kwargs = {} 11:21:46 11:21:46 def json(self, **kwargs): 11:21:46 r"""Returns the json-encoded content of a response, if any. 11:21:46 11:21:46 :param \*\*kwargs: Optional arguments that ``json.loads`` takes. 11:21:46 :raises requests.exceptions.JSONDecodeError: If the response body does not 11:21:46 contain valid json. 11:21:46 """ 11:21:46 11:21:46 if not self.encoding and self.content and len(self.content) > 3: 11:21:46 # No encoding set. JSON RFC 4627 section 3 states we should expect 11:21:46 # UTF-8, -16 or -32. Detect which one to use; If the detection or 11:21:46 # decoding fails, fall back to `self.text` (using charset_normalizer to make 11:21:46 # a best guess). 11:21:46 encoding = guess_json_utf(self.content) 11:21:46 if encoding is not None: 11:21:46 try: 11:21:46 return complexjson.loads(self.content.decode(encoding), **kwargs) 11:21:46 except UnicodeDecodeError: 11:21:46 # Wrong UTF codec detected; usually because it's not UTF-8 11:21:46 # but some other 8-bit codec. This is an RFC violation, 11:21:46 # and the server didn't bother to tell us what codec *was* 11:21:46 # used. 11:21:46 pass 11:21:46 except JSONDecodeError as e: 11:21:46 raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 11:21:46 try: 11:21:46 return complexjson.loads(self.text, **kwargs) 11:21:46 except JSONDecodeError as e: 11:21:46 # Catch JSON-related errors and raise as requests.JSONDecodeError 11:21:46 # This aliases json.JSONDecodeError and simplejson.JSONDecodeError 11:21:46 > raise RequestsJSONDecodeError(e.msg, e.doc, e.pos) 11:21:46 E requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/models.py:978: JSONDecodeError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_50_loop_create_oc_service 11:21:46 _______________ TransportPCEFulltesting.test_51_disconnect_XPDRA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_51_disconnect_XPDRA(self): 11:21:46 response = test_utils.unmount_device("XPDRA01") 11:21:46 > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) 11:21:46 E AssertionError: 404 not found in (200, 204) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:738: AssertionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_51_disconnect_XPDRA 11:21:46 Searching for patterns in karaf.log... 
Pattern not found after 180 seconds! Node XPDRA01 still not deleted from tpce topology... 11:21:46 _______________ TransportPCEFulltesting.test_52_disconnect_XPDRC _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_52_disconnect_XPDRC(self): 11:21:46 response = test_utils.unmount_device("XPDRC01") 11:21:46 > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) 11:21:46 E AssertionError: 404 not found in (200, 204) 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:742: AssertionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_52_disconnect_XPDRC 11:21:46 Searching for patterns in karaf.log... Pattern not found after 180 seconds! Node XPDRC01 still not deleted from tpce topology... 11:21:46 ______________ TransportPCEFulltesting.test_53_disconnect_ROADMA _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 
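test_53 fails differently from the JSON decoding errors above: urllib3's create_connection attempt to ('localhost', 8182) is refused (ConnectionRefusedError just below), so the DELETE on node=ROADMA01 never reaches the controller. A minimal probe for that condition, using only the host and port shown in this traceback (the helper itself is illustrative, not part of test_utils):

import socket

def controller_listening(host: str = "localhost", port: int = 8182, timeout: float = 2.0) -> bool:
    # True only if something accepts TCP connections on host:port;
    # the ConnectionRefusedError below corresponds to the False branch.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False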
11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'DELETE' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 
11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. 
Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 
11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 
11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'DELETE' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 
11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_53_disconnect_ROADMA(self): 11:21:46 > response = test_utils.unmount_device("ROADMA01") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:745: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:360: in unmount_device 11:21:46 response = delete_request(url[RESTCONF_VERSION].format('{}', node)) 11:21:46 transportpce_tests/common/test_utils.py:133: in delete_request 
11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_53_disconnect_ROADMA 11:21:46 ______________ TransportPCEFulltesting.test_54_disconnect_ROADMC _______________ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 > sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 11:21:46 raise err 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 address = ('localhost', 8182), timeout = 10, source_address = None 11:21:46 socket_options = [(6, 1, 1)] 11:21:46 11:21:46 def create_connection( 11:21:46 address: tuple[str, int], 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 source_address: tuple[str, int] | None = None, 11:21:46 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 11:21:46 ) -> socket.socket: 11:21:46 """Connect to *address* and return the socket object. 11:21:46 11:21:46 Convenience function. Connect to *address* (a 2-tuple ``(host, 11:21:46 port)``) and return the socket object. Passing the optional 11:21:46 *timeout* parameter will set the timeout on the socket instance 11:21:46 before attempting to connect. 
If no *timeout* is supplied, the 11:21:46 global default timeout setting returned by :func:`socket.getdefaulttimeout` 11:21:46 is used. If *source_address* is set it must be a tuple of (host, port) 11:21:46 for the socket to bind as a source address before making the connection. 11:21:46 An host of '' or port 0 tells the OS to use the default. 11:21:46 """ 11:21:46 11:21:46 host, port = address 11:21:46 if host.startswith("["): 11:21:46 host = host.strip("[]") 11:21:46 err = None 11:21:46 11:21:46 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 11:21:46 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 11:21:46 # The original create_connection function always returns all records. 11:21:46 family = allowed_gai_family() 11:21:46 11:21:46 try: 11:21:46 host.encode("idna") 11:21:46 except UnicodeError: 11:21:46 raise LocationParseError(f"'{host}', label empty or too long") from None 11:21:46 11:21:46 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 11:21:46 af, socktype, proto, canonname, sa = res 11:21:46 sock = None 11:21:46 try: 11:21:46 sock = socket.socket(af, socktype, proto) 11:21:46 11:21:46 # If provided, set socket level options before connecting. 11:21:46 _set_socket_options(sock, socket_options) 11:21:46 11:21:46 if timeout is not _DEFAULT_TIMEOUT: 11:21:46 sock.settimeout(timeout) 11:21:46 if source_address: 11:21:46 sock.bind(source_address) 11:21:46 > sock.connect(sa) 11:21:46 E ConnectionRefusedError: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 method = 'DELETE' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' 11:21:46 body = None 11:21:46 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 11:21:46 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 redirect = False, assert_same_host = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 11:21:46 release_conn = False, chunked = False, body_pos = None, preload_content = False 11:21:46 decode_content = False, response_kw = {} 11:21:46 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01', query=None, fragment=None) 11:21:46 destination_scheme = None, conn = None, release_this_conn = True 11:21:46 http_tunnel_required = False, err = None, clean_exit = False 11:21:46 11:21:46 def urlopen( # type: ignore[override] 11:21:46 self, 11:21:46 method: str, 11:21:46 url: str, 11:21:46 body: _TYPE_BODY | None = None, 11:21:46 headers: typing.Mapping[str, str] | None = None, 11:21:46 retries: Retry | bool | int | None = None, 11:21:46 redirect: bool = True, 11:21:46 assert_same_host: bool = True, 11:21:46 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 11:21:46 pool_timeout: int | None = None, 11:21:46 release_conn: bool | None = None, 11:21:46 chunked: bool = False, 11:21:46 body_pos: _TYPE_BODY_POSITION | None = None, 11:21:46 preload_content: bool = True, 11:21:46 decode_content: bool = True, 11:21:46 **response_kw: typing.Any, 
11:21:46 ) -> BaseHTTPResponse: 11:21:46 """ 11:21:46 Get a connection from the pool and perform an HTTP request. This is the 11:21:46 lowest level call for making a request, so you'll need to specify all 11:21:46 the raw details. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 More commonly, it's appropriate to use a convenience method 11:21:46 such as :meth:`request`. 11:21:46 11:21:46 .. note:: 11:21:46 11:21:46 `release_conn` will only behave as expected if 11:21:46 `preload_content=False` because we want to make 11:21:46 `preload_content=False` the default behaviour someday soon without 11:21:46 breaking backwards compatibility. 11:21:46 11:21:46 :param method: 11:21:46 HTTP request method (such as GET, POST, PUT, etc.) 11:21:46 11:21:46 :param url: 11:21:46 The URL to perform the request on. 11:21:46 11:21:46 :param body: 11:21:46 Data to send in the request body, either :class:`str`, :class:`bytes`, 11:21:46 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 11:21:46 11:21:46 :param headers: 11:21:46 Dictionary of custom headers to send, such as User-Agent, 11:21:46 If-None-Match, etc. If None, pool headers are used. If provided, 11:21:46 these headers completely replace any pool-specific headers. 11:21:46 11:21:46 :param retries: 11:21:46 Configure the number of retries to allow before raising a 11:21:46 :class:`~urllib3.exceptions.MaxRetryError` exception. 11:21:46 11:21:46 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 11:21:46 :class:`~urllib3.util.retry.Retry` object for fine-grained control 11:21:46 over different types of retries. 11:21:46 Pass an integer number to retry connection errors that many times, 11:21:46 but no other types of errors. Pass zero to never retry. 11:21:46 11:21:46 If ``False``, then retries are disabled and any exception is raised 11:21:46 immediately. Also, instead of raising a MaxRetryError on redirects, 11:21:46 the redirect response will be returned. 11:21:46 11:21:46 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 11:21:46 11:21:46 :param redirect: 11:21:46 If True, automatically handle redirects (status codes 301, 302, 11:21:46 303, 307, 308). Each redirect counts as a retry. Disabling retries 11:21:46 will disable redirect, too. 11:21:46 11:21:46 :param assert_same_host: 11:21:46 If ``True``, will make sure that the host of the pool requests is 11:21:46 consistent else will raise HostChangedError. When ``False``, you can 11:21:46 use the pool on an HTTP proxy and request foreign hosts. 11:21:46 11:21:46 :param timeout: 11:21:46 If specified, overrides the default timeout for this one 11:21:46 request. It may be a float (in seconds) or an instance of 11:21:46 :class:`urllib3.util.Timeout`. 11:21:46 11:21:46 :param pool_timeout: 11:21:46 If set and the pool is set to block=True, then this method will 11:21:46 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 11:21:46 connection is available within the time period. 11:21:46 11:21:46 :param bool preload_content: 11:21:46 If True, the response's body will be preloaded into memory. 11:21:46 11:21:46 :param bool decode_content: 11:21:46 If True, will attempt to decode the body based on the 11:21:46 'content-encoding' header. 11:21:46 11:21:46 :param release_conn: 11:21:46 If False, then the urlopen call will not release the connection 11:21:46 back into the pool once a response is received (but will release if 11:21:46 you read the entire contents of the response such as when 11:21:46 `preload_content=True`). 
This is useful if you're not preloading 11:21:46 the response's content immediately. You will need to call 11:21:46 ``r.release_conn()`` on the response ``r`` to return the connection 11:21:46 back into the pool. If None, it takes the value of ``preload_content`` 11:21:46 which defaults to ``True``. 11:21:46 11:21:46 :param bool chunked: 11:21:46 If True, urllib3 will send the body using chunked transfer 11:21:46 encoding. Otherwise, urllib3 will send the body using the standard 11:21:46 content-length form. Defaults to False. 11:21:46 11:21:46 :param int body_pos: 11:21:46 Position to seek to in file-like body in the event of a retry or 11:21:46 redirect. Typically this won't need to be set because urllib3 will 11:21:46 auto-populate the value when needed. 11:21:46 """ 11:21:46 parsed_url = parse_url(url) 11:21:46 destination_scheme = parsed_url.scheme 11:21:46 11:21:46 if headers is None: 11:21:46 headers = self.headers 11:21:46 11:21:46 if not isinstance(retries, Retry): 11:21:46 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 11:21:46 11:21:46 if release_conn is None: 11:21:46 release_conn = preload_content 11:21:46 11:21:46 # Check host 11:21:46 if assert_same_host and not self.is_same_host(url): 11:21:46 raise HostChangedError(self, url, retries) 11:21:46 11:21:46 # Ensure that the URL we're connecting to is properly encoded 11:21:46 if url.startswith("/"): 11:21:46 url = to_str(_encode_target(url)) 11:21:46 else: 11:21:46 url = to_str(parsed_url.url) 11:21:46 11:21:46 conn = None 11:21:46 11:21:46 # Track whether `conn` needs to be released before 11:21:46 # returning/raising/recursing. Update this variable if necessary, and 11:21:46 # leave `release_conn` constant throughout the function. That way, if 11:21:46 # the function recurses, the original value of `release_conn` will be 11:21:46 # passed down into the recursive call, and its value will be respected. 11:21:46 # 11:21:46 # See issue #651 [1] for details. 11:21:46 # 11:21:46 # [1] 11:21:46 release_this_conn = release_conn 11:21:46 11:21:46 http_tunnel_required = connection_requires_http_tunnel( 11:21:46 self.proxy, self.proxy_config, destination_scheme 11:21:46 ) 11:21:46 11:21:46 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 11:21:46 # have to copy the headers dict so we can safely change it without those 11:21:46 # changes being reflected in anyone else's copy. 11:21:46 if not http_tunnel_required: 11:21:46 headers = headers.copy() # type: ignore[attr-defined] 11:21:46 headers.update(self.proxy_headers) # type: ignore[union-attr] 11:21:46 11:21:46 # Must keep the exception bound to a separate variable or else Python 3 11:21:46 # complains about UnboundLocalError. 11:21:46 err = None 11:21:46 11:21:46 # Keep track of whether we cleanly exited the except block. This 11:21:46 # ensures we do proper cleanup in finally. 11:21:46 clean_exit = False 11:21:46 11:21:46 # Rewind body position, if needed. Record current position 11:21:46 # for future rewinds in the event of a redirect/retry. 11:21:46 body_pos = set_file_position(body, body_pos) 11:21:46 11:21:46 try: 11:21:46 # Request a connection from the queue. 11:21:46 timeout_obj = self._get_timeout(timeout) 11:21:46 conn = self._get_conn(timeout=pool_timeout) 11:21:46 11:21:46 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 11:21:46 11:21:46 # Is this a closed/new connection that requires CONNECT tunnelling? 
11:21:46 if self.proxy is not None and http_tunnel_required and conn.is_closed: 11:21:46 try: 11:21:46 self._prepare_proxy(conn) 11:21:46 except (BaseSSLError, OSError, SocketTimeout) as e: 11:21:46 self._raise_timeout( 11:21:46 err=e, url=self.proxy.url, timeout_value=conn.timeout 11:21:46 ) 11:21:46 raise 11:21:46 11:21:46 # If we're going to release the connection in ``finally:``, then 11:21:46 # the response doesn't need to know about the connection. Otherwise 11:21:46 # it will also try to release it and we'll have a double-release 11:21:46 # mess. 11:21:46 response_conn = conn if not release_conn else None 11:21:46 11:21:46 # Make the request on the HTTPConnection object 11:21:46 > response = self._make_request( 11:21:46 conn, 11:21:46 method, 11:21:46 url, 11:21:46 timeout=timeout_obj, 11:21:46 body=body, 11:21:46 headers=headers, 11:21:46 chunked=chunked, 11:21:46 retries=retries, 11:21:46 response_conn=response_conn, 11:21:46 preload_content=preload_content, 11:21:46 decode_content=decode_content, 11:21:46 **response_kw, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 11:21:46 conn.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 11:21:46 self.endheaders() 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 11:21:46 self._send_output(message_body, encode_chunked=encode_chunked) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 11:21:46 self.send(msg) 11:21:46 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 11:21:46 self.connect() 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 11:21:46 self.sock = self._new_conn() 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 11:21:46 def _new_conn(self) -> socket.socket: 11:21:46 """Establish a socket connection and set nodelay settings on it. 11:21:46 11:21:46 :return: New socket connection. 11:21:46 """ 11:21:46 try: 11:21:46 sock = connection.create_connection( 11:21:46 (self._dns_host, self.port), 11:21:46 self.timeout, 11:21:46 source_address=self.source_address, 11:21:46 socket_options=self.socket_options, 11:21:46 ) 11:21:46 except socket.gaierror as e: 11:21:46 raise NameResolutionError(self.host, self, e) from e 11:21:46 except SocketTimeout as e: 11:21:46 raise ConnectTimeoutError( 11:21:46 self, 11:21:46 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 11:21:46 ) from e 11:21:46 11:21:46 except OSError as e: 11:21:46 > raise NewConnectionError( 11:21:46 self, f"Failed to establish a new connection: {e}" 11:21:46 ) from e 11:21:46 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 11:21:46 11:21:46 The above exception was the direct cause of the following exception: 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 
11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 > resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 11:21:46 retries = retries.increment( 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 11:21:46 method = 'DELETE' 11:21:46 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' 11:21:46 response = None 11:21:46 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 11:21:46 _pool = 11:21:46 _stacktrace = 11:21:46 11:21:46 def increment( 11:21:46 self, 11:21:46 method: str | None = None, 11:21:46 url: str | None = None, 11:21:46 response: BaseHTTPResponse | None = None, 11:21:46 error: Exception | None = None, 11:21:46 _pool: ConnectionPool | None = None, 11:21:46 _stacktrace: TracebackType | None = None, 11:21:46 ) -> Self: 11:21:46 """Return a new Retry object with incremented retry counters. 11:21:46 11:21:46 :param response: A response object, or None, if the server did not 11:21:46 return a response. 11:21:46 :type response: :class:`~urllib3.response.BaseHTTPResponse` 11:21:46 :param Exception error: An error encountered during the request, or 11:21:46 None if the response was received successfully. 11:21:46 11:21:46 :return: A new ``Retry`` object. 11:21:46 """ 11:21:46 if self.total is False and error: 11:21:46 # Disabled, indicate to re-raise the error. 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 11:21:46 total = self.total 11:21:46 if total is not None: 11:21:46 total -= 1 11:21:46 11:21:46 connect = self.connect 11:21:46 read = self.read 11:21:46 redirect = self.redirect 11:21:46 status_count = self.status 11:21:46 other = self.other 11:21:46 cause = "unknown" 11:21:46 status = None 11:21:46 redirect_location = None 11:21:46 11:21:46 if error and self._is_connection_error(error): 11:21:46 # Connect retry? 11:21:46 if connect is False: 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif connect is not None: 11:21:46 connect -= 1 11:21:46 11:21:46 elif error and self._is_read_error(error): 11:21:46 # Read retry? 11:21:46 if read is False or method is None or not self._is_method_retryable(method): 11:21:46 raise reraise(type(error), error, _stacktrace) 11:21:46 elif read is not None: 11:21:46 read -= 1 11:21:46 11:21:46 elif error: 11:21:46 # Other retry? 11:21:46 if other is not None: 11:21:46 other -= 1 11:21:46 11:21:46 elif response and response.get_redirect_location(): 11:21:46 # Redirect retry? 
11:21:46 if redirect is not None: 11:21:46 redirect -= 1 11:21:46 cause = "too many redirects" 11:21:46 response_redirect_location = response.get_redirect_location() 11:21:46 if response_redirect_location: 11:21:46 redirect_location = response_redirect_location 11:21:46 status = response.status 11:21:46 11:21:46 else: 11:21:46 # Incrementing because of a server error like a 500 in 11:21:46 # status_forcelist and the given method is in the allowed_methods 11:21:46 cause = ResponseError.GENERIC_ERROR 11:21:46 if response and response.status: 11:21:46 if status_count is not None: 11:21:46 status_count -= 1 11:21:46 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 11:21:46 status = response.status 11:21:46 11:21:46 history = self.history + ( 11:21:46 RequestHistory(method, url, error, status, redirect_location), 11:21:46 ) 11:21:46 11:21:46 new_retry = self.new( 11:21:46 total=total, 11:21:46 connect=connect, 11:21:46 read=read, 11:21:46 redirect=redirect, 11:21:46 status=status_count, 11:21:46 other=other, 11:21:46 history=history, 11:21:46 ) 11:21:46 11:21:46 if new_retry.is_exhausted(): 11:21:46 reason = error or ResponseError(cause) 11:21:46 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 11:21:46 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 11:21:46 11:21:46 During handling of the above exception, another exception occurred: 11:21:46 11:21:46 self = 11:21:46 11:21:46 def test_54_disconnect_ROADMC(self): 11:21:46 > response = test_utils.unmount_device("ROADMC01") 11:21:46 11:21:46 transportpce_tests/1.2.1/test06_end2end.py:749: 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 transportpce_tests/common/test_utils.py:360: in unmount_device 11:21:46 response = delete_request(url[RESTCONF_VERSION].format('{}', node)) 11:21:46 transportpce_tests/common/test_utils.py:133: in delete_request 11:21:46 return requests.request( 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 11:21:46 return session.request(method=method, url=url, **kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 11:21:46 resp = self.send(prep, **send_kwargs) 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 11:21:46 r = adapter.send(request, **kwargs) 11:21:46 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 11:21:46 11:21:46 self = 11:21:46 request = , stream = False 11:21:46 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 11:21:46 proxies = OrderedDict() 11:21:46 11:21:46 def send( 11:21:46 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 11:21:46 ): 11:21:46 """Sends PreparedRequest object. Returns Response object. 11:21:46 11:21:46 :param request: The :class:`PreparedRequest ` being sent. 11:21:46 :param stream: (optional) Whether to stream the request content. 11:21:46 :param timeout: (optional) How long to wait for the server to send 11:21:46 data before giving up, as a float, or a :ref:`(connect timeout, 11:21:46 read timeout) ` tuple. 
11:21:46 :type timeout: float or tuple or urllib3 Timeout object 11:21:46 :param verify: (optional) Either a boolean, in which case it controls whether 11:21:46 we verify the server's TLS certificate, or a string, in which case it 11:21:46 must be a path to a CA bundle to use 11:21:46 :param cert: (optional) Any user-provided SSL certificate to be trusted. 11:21:46 :param proxies: (optional) The proxies dictionary to apply to the request. 11:21:46 :rtype: requests.Response 11:21:46 """ 11:21:46 11:21:46 try: 11:21:46 conn = self.get_connection_with_tls_context( 11:21:46 request, verify, proxies=proxies, cert=cert 11:21:46 ) 11:21:46 except LocationValueError as e: 11:21:46 raise InvalidURL(e, request=request) 11:21:46 11:21:46 self.cert_verify(conn, request.url, verify, cert) 11:21:46 url = self.request_url(request, proxies) 11:21:46 self.add_headers( 11:21:46 request, 11:21:46 stream=stream, 11:21:46 timeout=timeout, 11:21:46 verify=verify, 11:21:46 cert=cert, 11:21:46 proxies=proxies, 11:21:46 ) 11:21:46 11:21:46 chunked = not (request.body is None or "Content-Length" in request.headers) 11:21:46 11:21:46 if isinstance(timeout, tuple): 11:21:46 try: 11:21:46 connect, read = timeout 11:21:46 timeout = TimeoutSauce(connect=connect, read=read) 11:21:46 except ValueError: 11:21:46 raise ValueError( 11:21:46 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 11:21:46 f"or a single float to set both timeouts to the same value." 11:21:46 ) 11:21:46 elif isinstance(timeout, TimeoutSauce): 11:21:46 pass 11:21:46 else: 11:21:46 timeout = TimeoutSauce(connect=timeout, read=timeout) 11:21:46 11:21:46 try: 11:21:46 resp = conn.urlopen( 11:21:46 method=request.method, 11:21:46 url=url, 11:21:46 body=request.body, 11:21:46 headers=request.headers, 11:21:46 redirect=False, 11:21:46 assert_same_host=False, 11:21:46 preload_content=False, 11:21:46 decode_content=False, 11:21:46 retries=self.max_retries, 11:21:46 timeout=timeout, 11:21:46 chunked=chunked, 11:21:46 ) 11:21:46 11:21:46 except (ProtocolError, OSError) as err: 11:21:46 raise ConnectionError(err, request=request) 11:21:46 11:21:46 except MaxRetryError as e: 11:21:46 if isinstance(e.reason, ConnectTimeoutError): 11:21:46 # TODO: Remove this in 3.0.0: see #2811 11:21:46 if not isinstance(e.reason, NewConnectionError): 11:21:46 raise ConnectTimeout(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, ResponseError): 11:21:46 raise RetryError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _ProxyError): 11:21:46 raise ProxyError(e, request=request) 11:21:46 11:21:46 if isinstance(e.reason, _SSLError): 11:21:46 # This branch is for urllib3 v1.22 and later. 
11:21:46 raise SSLError(e, request=request) 11:21:46 11:21:46 > raise ConnectionError(e, request=request) 11:21:46 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 11:21:46 11:21:46 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 11:21:46 ----------------------------- Captured stdout call ----------------------------- 11:21:46 execution of test_54_disconnect_ROADMC 11:21:46 --------------------------- Captured stdout teardown --------------------------- 11:21:46 all processes killed 11:21:46 =========================== short test summary info ============================ 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_01_connect_xpdrA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_02_connect_xpdrC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_03_connect_rdmA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_04_connect_rdmC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_05_connect_xpdrA_N1_to_roadmA_PP1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_06_connect_roadmA_PP1_to_xpdrA_N1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_07_connect_xpdrC_N1_to_roadmC_PP1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_08_connect_roadmC_PP1_to_xpdrC_N1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_09_add_omsAttributes_ROADMA_ROADMC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_10_add_omsAttributes_ROADMC_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_11_create_eth_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_12_get_eth_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_13_check_xc1_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_14_check_xc1_ROADMC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_15_check_topo_XPDRA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_16_check_topo_ROADMA_SRG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_17_check_topo_ROADMA_DEG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_18_connect_xpdrA_N2_to_roadmA_PP2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_19_connect_roadmA_PP2_to_xpdrA_N2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_20_connect_xpdrC_N2_to_roadmC_PP2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_21_connect_roadmC_PP2_to_xpdrC_N2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_22_create_eth_service2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_23_get_eth_service2 
11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_24_check_xc2_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_25_check_topo_XPDRA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_26_check_topo_ROADMA_SRG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_27_check_topo_ROADMA_DEG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_28_create_eth_service3 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_29_delete_eth_service3 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_30_delete_eth_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_31_delete_eth_service2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_32_check_no_xc_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_33_check_topo_XPDRA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_34_check_topo_ROADMA_SRG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_35_check_topo_ROADMA_DEG1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_36_create_oc_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_37_get_oc_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_38_check_xc1_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_39_check_xc1_ROADMC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_40_create_oc_service2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_41_get_oc_service2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_42_check_xc2_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_43_check_topo_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_44_delete_oc_service1 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_45_delete_oc_service2 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_46_get_no_oc_services 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_47_get_no_xc_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_48_check_topo_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_49_loop_create_eth_service 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_50_loop_create_oc_service 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_51_disconnect_XPDRA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_52_disconnect_XPDRC 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_53_disconnect_ROADMA 11:21:46 FAILED transportpce_tests/1.2.1/test06_end2end.py::TransportPCEFulltesting::test_54_disconnect_ROADMC 11:21:46 54 failed in 420.38s (0:07:00) 11:21:46 tests121: exit 1 (992.24 
seconds) /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1 pid=35901 11:22:07 ................................... [100%] 11:22:46 35 passed in 74.41s (0:01:14) 11:22:46 pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py 11:23:17 ...... [100%] 11:26:30 6 passed in 223.81s (0:03:43) 11:26:30 pytest -q transportpce_tests/2.2.1/test03_topology.py 11:27:12 ............................................ [100%] 11:28:46 44 passed in 135.31s (0:02:15) 11:28:46 pytest -q transportpce_tests/2.2.1/test04_otn_topology.py 11:29:20 ............ [100%] 11:29:44 12 passed in 58.21s 11:29:44 pytest -q transportpce_tests/2.2.1/test05_flex_grid.py 11:30:09 ................ [100%] 11:31:38 16 passed in 112.90s (0:01:52) 11:31:38 pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py 11:32:05 ............................... [100%] 11:32:12 31 passed in 33.85s 11:32:12 pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py 11:32:46 .......................... [100%] 11:33:41 26 passed in 89.57s (0:01:29) 11:33:41 pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py 11:34:16 ...................... [100%] 11:35:20 22 passed in 98.26s (0:01:38) 11:35:20 pytest -q transportpce_tests/2.2.1/test09_olm.py 11:36:00 ........................................ [100%] 11:38:21 40 passed in 181.18s (0:03:01) 11:38:21 pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py 11:39:03 ........................................................................ [ 74%] 11:44:39 ......................... [100%] 11:46:31 97 passed in 489.26s (0:08:09) 11:46:31 pytest -q transportpce_tests/2.2.1/test12_end2end.py 11:47:09 ...................................................... [100%] 11:53:56 54 passed in 445.35s (0:07:25) 11:53:56 pytest -q transportpce_tests/2.2.1/test14_otn_switch_end2end.py 11:54:49 ........................................................................ [ 71%] 11:59:58 ............................. [100%] 12:02:07 101 passed in 490.23s (0:08:10) 12:02:07 pytest -q transportpce_tests/2.2.1/test15_otn_end2end_with_intermediate_switch.py 12:03:00 ........................................................................ [ 67%] 12:08:46 ................................... [100%] 12:15:07 107 passed in 779.61s (0:12:59) 12:15:07 tests121: FAIL ✖ in 16 minutes 38.34 seconds 12:15:07 tests221: OK ✔ in 53 minutes 41.14 seconds 12:15:07 tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 12:15:13 tests_hybrid: freeze> python -m pip freeze --all 12:15:13 tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 12:15:13 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh hybrid 12:15:13 using environment variables from ./karaf121.env 12:15:13 pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py 12:15:57 ................................................... 
[100%] 12:17:43 51 passed in 150.49s (0:02:30) 12:17:44 pytest -q transportpce_tests/hybrid/test02_B100G_end2end.py 12:18:25 ........................................................................ [ 66%] 12:22:45 ..................................... [100%] 12:24:51 109 passed in 427.49s (0:07:07) 12:24:51 pytest -q transportpce_tests/hybrid/test03_autonomous_reroute.py 12:25:37 ..................................................... [100%] 12:32:09 53 passed in 437.82s (0:07:17) 12:32:09 tests_hybrid: OK ✔ in 17 minutes 2.59 seconds 12:32:09 buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt 12:32:15 buildlighty: freeze> python -m pip freeze --all 12:32:15 buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.3,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.3.1,pluggy==1.5.0,psutil==6.1.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.2.0,urllib3==2.2.3,wheel==0.44.0 12:32:15 buildlighty: commands[0] /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh 12:32:15 NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED 12:32:26 [ERROR] COMPILATION ERROR : 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol 12:32:26 symbol: class YangModuleInfo 12:32:26 location: package org.opendaylight.yangtools.binding 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol 12:32:26 symbol: class YangModuleInfo 12:32:26 location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol 12:32:26 symbol: class YangModuleInfo 12:32:26 location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol 12:32:26 symbol: class YangModuleInfo 12:32:26 location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure: 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol 12:32:26 [ERROR] symbol: class YangModuleInfo 12:32:26 [ERROR] location: package org.opendaylight.yangtools.binding 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol 12:32:26 [ERROR] symbol: class YangModuleInfo 12:32:26 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol 12:32:26 [ERROR] symbol: class YangModuleInfo 12:32:26 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] 
/w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol 12:32:26 [ERROR] symbol: class YangModuleInfo 12:32:26 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 12:32:26 [ERROR] -> [Help 1] 12:32:26 [ERROR] 12:32:26 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 12:32:26 [ERROR] Re-run Maven using the -X switch to enable full debug logging. 12:32:26 [ERROR] 12:32:26 [ERROR] For more information about the errors and possible solutions, please read the following articles: 12:32:26 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException 12:32:26 unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP. 12:32:26 buildlighty: exit 9 (10.43 seconds) /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh pid=66835 12:32:26 buildlighty: command failed but is marked ignore outcome so handling it as success 12:32:26 buildcontroller: OK (102.78=setup[7.78]+cmd[94.99] seconds) 12:32:26 testsPCE: OK (318.30=setup[78.38]+cmd[239.92] seconds) 12:32:26 sims: OK (11.23=setup[7.67]+cmd[3.56] seconds) 12:32:26 build_karaf_tests121: OK (53.60=setup[7.50]+cmd[46.10] seconds) 12:32:26 tests121: FAIL code 1 (998.34=setup[6.10]+cmd[992.24] seconds) 12:32:26 build_karaf_tests221: OK (52.54=setup[7.50]+cmd[45.03] seconds) 12:32:26 tests_tapi: OK (821.70=setup[6.63]+cmd[815.07] seconds) 12:32:26 tests221: OK (3221.14=setup[6.26]+cmd[3214.88] seconds) 12:32:26 build_karaf_tests71: OK (62.42=setup[13.65]+cmd[48.77] seconds) 12:32:26 tests71: OK (420.72=setup[6.23]+cmd[414.49] seconds) 12:32:26 build_karaf_tests_hybrid: OK (56.81=setup[7.69]+cmd[49.12] seconds) 12:32:26 tests_hybrid: OK (1022.59=setup[6.10]+cmd[1016.49] seconds) 12:32:26 buildlighty: OK (16.53=setup[6.11]+cmd[10.43] seconds) 12:32:26 docs: OK (33.54=setup[30.96]+cmd[2.58] seconds) 12:32:26 docs-linkcheck: OK (35.11=setup[31.56]+cmd[3.55] seconds) 12:32:26 checkbashisms: OK (3.08=setup[2.17]+cmd[0.01,0.05,0.86] seconds) 12:32:26 pre-commit: OK (44.52=setup[3.32]+cmd[0.01,0.01,34.36,6.82] seconds) 12:32:26 pylint: OK (26.27=setup[4.64]+cmd[21.63] seconds) 12:32:26 evaluation failed :( (5659.14 seconds) 12:32:26 + tox_status=255 12:32:26 + echo '---> Completed tox runs' 12:32:26 ---> Completed tox runs 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/build_karaf_tests121/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=build_karaf_tests121 12:32:26 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests121 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/build_karaf_tests221/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=build_karaf_tests221 12:32:26 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests221 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/build_karaf_tests71/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=build_karaf_tests71 12:32:26 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests71 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/build_karaf_tests_hybrid/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=build_karaf_tests_hybrid 12:32:26 + cp -r .tox/build_karaf_tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests_hybrid 12:32:26 + for i in .tox/*/log 
12:32:26 ++ echo .tox/buildcontroller/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=buildcontroller 12:32:26 + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildcontroller 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/buildlighty/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=buildlighty 12:32:26 + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildlighty 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/checkbashisms/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=checkbashisms 12:32:26 + cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/checkbashisms 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/docs-linkcheck/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=docs-linkcheck 12:32:26 + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/docs-linkcheck 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/docs/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=docs 12:32:26 + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/docs 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/pre-commit/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=pre-commit 12:32:26 + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pre-commit 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/pylint/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=pylint 12:32:26 + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pylint 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/sims/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=sims 12:32:26 + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/sims 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/tests121/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=tests121 12:32:26 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests121 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/tests221/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=tests221 12:32:26 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests221 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/tests71/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=tests71 12:32:26 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests71 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/testsPCE/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=testsPCE 12:32:26 + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/testsPCE 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/tests_hybrid/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=tests_hybrid 12:32:26 + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_hybrid 12:32:26 + for i in .tox/*/log 12:32:26 ++ echo .tox/tests_tapi/log 12:32:26 ++ awk -F/ '{print $2}' 12:32:26 + tox_env=tests_tapi 12:32:26 + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_tapi 12:32:26 + DOC_DIR=docs/_build/html 12:32:26 + [[ -d docs/_build/html ]] 12:32:26 + echo '---> Archiving generated docs' 12:32:26 ---> Archiving generated docs 12:32:26 + mv docs/_build/html /w/workspace/transportpce-tox-verify-scandium/archives/docs 12:32:26 + echo '---> tox-run.sh ends' 
12:32:26 ---> tox-run.sh ends 12:32:26 + test 255 -eq 0 12:32:26 + exit 255 12:32:26 ++ '[' 1 = 1 ']' 12:32:26 ++ '[' -x /usr/bin/clear_console ']' 12:32:26 ++ /usr/bin/clear_console -q 12:32:26 Build step 'Execute shell' marked build as failure 12:32:26 $ ssh-agent -k 12:32:26 unset SSH_AUTH_SOCK; 12:32:26 unset SSH_AGENT_PID; 12:32:26 echo Agent pid 12211 killed; 12:32:26 [ssh-agent] Stopped. 12:32:26 [PostBuildScript] - [INFO] Executing post build scripts. 12:32:26 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins13965670242083456729.sh 12:32:26 ---> sysstat.sh 12:32:27 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins16786263851868668635.sh 12:32:27 ---> package-listing.sh 12:32:27 ++ tr '[:upper:]' '[:lower:]' 12:32:27 ++ facter osfamily 12:32:27 + OS_FAMILY=debian 12:32:27 + workspace=/w/workspace/transportpce-tox-verify-scandium 12:32:27 + START_PACKAGES=/tmp/packages_start.txt 12:32:27 + END_PACKAGES=/tmp/packages_end.txt 12:32:27 + DIFF_PACKAGES=/tmp/packages_diff.txt 12:32:27 + PACKAGES=/tmp/packages_start.txt 12:32:27 + '[' /w/workspace/transportpce-tox-verify-scandium ']' 12:32:27 + PACKAGES=/tmp/packages_end.txt 12:32:27 + case "${OS_FAMILY}" in 12:32:27 + dpkg -l 12:32:27 + grep '^ii' 12:32:27 + '[' -f /tmp/packages_start.txt ']' 12:32:27 + '[' -f /tmp/packages_end.txt ']' 12:32:27 + diff /tmp/packages_start.txt /tmp/packages_end.txt 12:32:27 + '[' /w/workspace/transportpce-tox-verify-scandium ']' 12:32:27 + mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/ 12:32:27 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-scandium/archives/ 12:32:27 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins11584610775380114531.sh 12:32:27 ---> capture-instance-metadata.sh 12:32:27 Setup pyenv: 12:32:27 system 12:32:27 3.8.13 12:32:27 3.9.13 12:32:27 3.10.13 12:32:27 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 12:32:27 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-KOUk from file:/tmp/.os_lf_venv 12:32:28 lf-activate-venv(): INFO: Installing: lftools 12:32:39 lf-activate-venv(): INFO: Adding /tmp/venv-KOUk/bin to PATH 12:32:39 INFO: Running in OpenStack, capturing instance metadata 12:32:39 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins14227056327211597728.sh 12:32:39 provisioning config files... 12:32:40 Could not find credentials [logs] for transportpce-tox-verify-scandium #28 12:32:40 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-scandium@tmp/config11256712665243140851tmp 12:32:40 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 12:32:40 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 12:32:40 provisioning config files... 12:32:40 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 12:32:40 [EnvInject] - Injecting environment variables from a build step. 12:32:40 [EnvInject] - Injecting as environment variables the properties content 12:32:40 SERVER_ID=logs 12:32:40 12:32:40 [EnvInject] - Variables injected successfully. 12:32:40 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins5789454421954443223.sh 12:32:40 ---> create-netrc.sh 12:32:40 WARN: Log server credential not found. 
12:32:40 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins9452015267883165078.sh 12:32:40 ---> python-tools-install.sh 12:32:40 Setup pyenv: 12:32:40 system 12:32:40 3.8.13 12:32:40 3.9.13 12:32:40 3.10.13 12:32:40 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 12:32:40 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-KOUk from file:/tmp/.os_lf_venv 12:32:41 lf-activate-venv(): INFO: Installing: lftools 12:32:50 lf-activate-venv(): INFO: Adding /tmp/venv-KOUk/bin to PATH 12:32:50 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins17915295326008152366.sh 12:32:50 ---> sudo-logs.sh 12:32:50 Archiving 'sudo' log.. 12:32:50 [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins16735518603099710476.sh 12:32:50 ---> job-cost.sh 12:32:50 Setup pyenv: 12:32:50 system 12:32:50 3.8.13 12:32:50 3.9.13 12:32:50 3.10.13 12:32:50 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 12:32:50 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-KOUk from file:/tmp/.os_lf_venv 12:32:51 lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 12:32:56 lf-activate-venv(): INFO: Adding /tmp/venv-KOUk/bin to PATH 12:32:56 INFO: No Stack... 12:32:56 INFO: Retrieving Pricing Info for: v3-standard-4 12:32:57 INFO: Archiving Costs 12:32:57 [transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins10849005618523338915.sh 12:32:57 ---> logs-deploy.sh 12:32:57 Setup pyenv: 12:32:57 system 12:32:57 3.8.13 12:32:57 3.9.13 12:32:57 3.10.13 12:32:57 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) 12:32:57 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-KOUk from file:/tmp/.os_lf_venv 12:32:58 lf-activate-venv(): INFO: Installing: lftools 12:33:06 lf-activate-venv(): INFO: Adding /tmp/venv-KOUk/bin to PATH 12:33:06 WARNING: Nexus logging server not set 12:33:06 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-scandium/28/ 12:33:06 INFO: archiving logs to S3 12:33:08 ---> uname -a: 12:33:08 Linux prd-ubuntu2004-docker-4c-16g-2816 5.4.0-190-generic #210-Ubuntu SMP Fri Jul 5 17:03:38 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 12:33:08 12:33:08 12:33:08 ---> lscpu: 12:33:08 Architecture: x86_64 12:33:08 CPU op-mode(s): 32-bit, 64-bit 12:33:08 Byte Order: Little Endian 12:33:08 Address sizes: 40 bits physical, 48 bits virtual 12:33:08 CPU(s): 4 12:33:08 On-line CPU(s) list: 0-3 12:33:08 Thread(s) per core: 1 12:33:08 Core(s) per socket: 1 12:33:08 Socket(s): 4 12:33:08 NUMA node(s): 1 12:33:08 Vendor ID: AuthenticAMD 12:33:08 CPU family: 23 12:33:08 Model: 49 12:33:08 Model name: AMD EPYC-Rome Processor 12:33:08 Stepping: 0 12:33:08 CPU MHz: 2800.000 12:33:08 BogoMIPS: 5600.00 12:33:08 Virtualization: AMD-V 12:33:08 Hypervisor vendor: KVM 12:33:08 Virtualization type: full 12:33:08 L1d cache: 128 KiB 12:33:08 L1i cache: 128 KiB 12:33:08 L2 cache: 2 MiB 12:33:08 L3 cache: 64 MiB 12:33:08 NUMA node0 CPU(s): 0-3 12:33:08 Vulnerability Gather data sampling: Not affected 12:33:08 Vulnerability Itlb multihit: Not affected 12:33:08 Vulnerability L1tf: Not affected 12:33:08 Vulnerability Mds: Not affected 12:33:08 Vulnerability Meltdown: Not affected 12:33:08 Vulnerability Mmio stale data: Not affected 12:33:08 Vulnerability Retbleed: Vulnerable 12:33:08 Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp 12:33:08 Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization 12:33:08 
Vulnerability Spectre v2: Mitigation; Retpolines; IBPB conditional; IBRS_FW; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected 12:33:08 Vulnerability Srbds: Not affected 12:33:08 Vulnerability Tsx async abort: Not affected 12:33:08 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 12:33:08 12:33:08 12:33:08 ---> nproc: 12:33:08 4 12:33:08 12:33:08 12:33:08 ---> df -h: 12:33:08 Filesystem Size Used Avail Use% Mounted on 12:33:08 udev 7.8G 0 7.8G 0% /dev 12:33:08 tmpfs 1.6G 1.1M 1.6G 1% /run 12:33:08 /dev/vda1 78G 16G 62G 21% / 12:33:08 tmpfs 7.9G 0 7.9G 0% /dev/shm 12:33:08 tmpfs 5.0M 0 5.0M 0% /run/lock 12:33:08 tmpfs 7.9G 0 7.9G 0% /sys/fs/cgroup 12:33:08 /dev/loop0 62M 62M 0 100% /snap/core20/1405 12:33:08 /dev/loop2 44M 44M 0 100% /snap/snapd/15177 12:33:08 /dev/loop1 68M 68M 0 100% /snap/lxd/22753 12:33:08 /dev/vda15 105M 6.1M 99M 6% /boot/efi 12:33:08 tmpfs 1.6G 0 1.6G 0% /run/user/1001 12:33:08 /dev/loop3 64M 64M 0 100% /snap/core20/2434 12:33:08 /dev/loop4 92M 92M 0 100% /snap/lxd/29619 12:33:08 12:33:08 12:33:08 ---> free -m: 12:33:08 total used free shared buff/cache available 12:33:08 Mem: 15997 735 6791 1 8469 14922 12:33:08 Swap: 1023 0 1023 12:33:08 12:33:08 12:33:08 ---> ip addr: 12:33:08 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 12:33:08 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 12:33:08 inet 127.0.0.1/8 scope host lo 12:33:08 valid_lft forever preferred_lft forever 12:33:08 inet6 ::1/128 scope host 12:33:08 valid_lft forever preferred_lft forever 12:33:08 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000 12:33:08 link/ether fa:16:3e:cd:10:71 brd ff:ff:ff:ff:ff:ff 12:33:08 inet 10.30.170.113/23 brd 10.30.171.255 scope global dynamic ens3 12:33:08 valid_lft 80583sec preferred_lft 80583sec 12:33:08 inet6 fe80::f816:3eff:fecd:1071/64 scope link 12:33:08 valid_lft forever preferred_lft forever 12:33:08 3: docker0: mtu 1458 qdisc noqueue state DOWN group default 12:33:08 link/ether 02:42:95:78:34:39 brd ff:ff:ff:ff:ff:ff 12:33:08 inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0 12:33:08 valid_lft forever preferred_lft forever 12:33:08 12:33:08 12:33:08 ---> sar -b -r -n DEV: 12:33:08 Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-2816) 10/30/24 _x86_64_ (4 CPU) 12:33:08 12:33:08 10:56:13 LINUX RESTART (4 CPU) 12:33:08 12:33:08 10:57:02 tps rtps wtps dtps bread/s bwrtn/s bdscd/s 12:33:08 10:58:01 154.15 61.41 92.75 0.00 1585.90 10963.95 0.00 12:33:08 10:59:01 246.89 41.01 205.88 0.00 2707.05 40161.39 0.00 12:33:08 11:00:01 84.55 2.00 82.55 0.00 129.31 32929.31 0.00 12:33:08 11:01:01 181.01 7.61 173.39 0.00 259.38 132929.02 0.00 12:33:08 11:02:01 203.50 9.58 193.92 0.00 4708.70 81582.81 0.00 12:33:08 11:03:01 121.69 1.78 119.91 0.00 52.92 7376.47 0.00 12:33:08 11:04:01 87.90 1.25 86.65 0.00 150.37 1694.92 0.00 12:33:08 11:05:01 143.61 0.17 143.44 0.00 25.73 2343.34 0.00 12:33:08 11:06:01 122.58 0.52 122.06 0.00 15.46 11680.32 0.00 
12:33:08 11:07:01 91.52 0.00 91.52 0.00 0.00 1480.29 0.00 12:33:08 11:08:01 97.47 0.02 97.45 0.00 0.13 1967.94 0.00 12:33:08 11:09:01 54.62 0.07 54.56 0.00 2.80 796.67 0.00 12:33:08 11:10:01 32.44 2.40 30.04 0.00 433.19 667.38 0.00 12:33:08 11:11:01 159.64 0.00 159.64 0.00 0.00 3207.20 0.00 12:33:08 11:12:01 26.18 0.98 25.20 0.00 22.66 427.40 0.00 12:33:08 11:13:01 59.91 0.00 59.91 0.00 0.00 862.12 0.00 12:33:08 11:14:01 4.57 0.00 4.57 0.00 0.00 108.52 0.00 12:33:08 11:15:01 65.01 0.25 64.76 0.00 25.46 2681.15 0.00 12:33:08 11:16:01 143.59 0.08 143.51 0.00 2.67 9058.22 0.00 12:33:08 11:17:01 3.98 0.05 3.93 0.00 1.07 61.32 0.00 12:33:08 11:18:01 3.72 0.00 3.72 0.00 0.00 61.06 0.00 12:33:08 11:19:01 83.57 0.00 83.57 0.00 0.00 1405.90 0.00 12:33:08 11:20:01 2.23 0.00 2.23 0.00 0.00 43.33 0.00 12:33:08 11:21:01 86.15 0.00 86.15 0.00 0.00 1234.99 0.00 12:33:08 11:22:01 35.61 0.00 35.61 0.00 0.00 1448.43 0.00 12:33:08 11:23:01 75.24 0.00 75.24 0.00 0.00 2552.75 0.00 12:33:08 11:24:01 58.12 0.00 58.12 0.00 0.00 817.33 0.00 12:33:08 11:25:01 1.67 0.00 1.67 0.00 0.00 22.00 0.00 12:33:08 11:26:01 1.22 0.00 1.22 0.00 0.00 15.60 0.00 12:33:08 11:27:01 24.08 0.00 24.08 0.00 0.00 385.14 0.00 12:33:08 11:28:01 54.78 0.00 54.78 0.00 0.00 776.14 0.00 12:33:08 11:29:01 25.07 0.00 25.07 0.00 0.00 390.93 0.00 12:33:08 11:30:01 75.52 0.00 75.52 0.00 0.00 1296.98 0.00 12:33:08 11:31:01 60.16 0.00 60.16 0.00 0.00 846.93 0.00 12:33:08 11:32:01 17.14 0.00 17.14 0.00 0.00 279.64 0.00 12:33:08 11:33:01 143.67 0.00 143.67 0.00 0.00 2048.92 0.00 12:33:08 11:34:01 16.83 0.00 16.83 0.00 0.00 277.55 0.00 12:33:08 11:35:01 55.94 0.00 55.94 0.00 0.00 794.40 0.00 12:33:08 11:36:01 74.27 0.00 74.27 0.00 0.00 1084.75 0.00 12:33:08 11:37:01 2.32 0.00 2.32 0.00 0.00 52.38 0.00 12:33:08 11:38:01 1.95 0.00 1.95 0.00 0.00 46.39 0.00 12:33:08 11:39:01 71.32 0.00 71.32 0.00 0.00 1051.56 0.00 12:33:08 11:40:01 2.23 0.00 2.23 0.00 0.00 58.79 0.00 12:33:08 11:41:01 3.12 0.00 3.12 0.00 0.00 50.66 0.00 12:33:08 11:42:01 1.78 0.00 1.78 0.00 0.00 31.19 0.00 12:33:08 11:43:01 1.95 0.00 1.95 0.00 0.00 35.05 0.00 12:33:08 11:44:01 1.90 0.00 1.90 0.00 0.00 46.13 0.00 12:33:08 11:45:01 2.45 0.00 2.45 0.00 0.00 44.53 0.00 12:33:08 11:46:01 1.42 0.00 1.42 0.00 0.00 26.53 0.00 12:33:08 11:47:01 26.67 0.00 26.67 0.00 0.00 434.12 0.00 12:33:08 11:48:01 61.81 0.00 61.81 0.00 0.00 891.58 0.00 12:33:08 11:49:01 3.10 0.00 3.10 0.00 0.00 57.19 0.00 12:33:08 11:50:01 2.47 0.00 2.47 0.00 0.00 61.46 0.00 12:33:08 11:51:01 2.55 0.00 2.55 0.00 0.00 45.99 0.00 12:33:08 11:52:01 1.75 0.00 1.75 0.00 0.00 44.92 0.00 12:33:08 11:53:01 2.07 0.00 2.07 0.00 0.00 45.99 0.00 12:33:08 11:54:01 15.71 0.00 15.71 0.00 0.00 279.29 0.00 12:33:08 11:55:01 74.44 0.00 74.44 0.00 0.00 1061.16 0.00 12:33:08 11:56:01 2.77 0.00 2.77 0.00 0.00 64.66 0.00 12:33:08 11:57:01 3.73 0.00 3.73 0.00 0.00 78.79 0.00 12:33:08 11:58:01 2.58 0.00 2.58 0.00 0.00 45.86 0.00 12:33:08 11:59:01 2.88 0.00 2.88 0.00 0.00 48.13 0.00 12:33:08 12:00:01 2.45 0.00 2.45 0.00 0.00 54.26 0.00 12:33:08 12:01:01 3.88 0.00 3.88 0.00 0.00 70.25 0.00 12:33:08 12:02:01 2.27 0.00 2.27 0.00 0.00 41.19 0.00 12:33:08 12:03:01 90.45 0.00 90.45 0.00 0.00 1309.38 0.00 12:33:08 12:04:01 2.90 0.00 2.90 0.00 0.00 63.06 0.00 12:33:08 12:05:01 4.08 0.00 4.08 0.00 0.00 89.05 0.00 12:33:08 12:06:01 2.52 0.00 2.52 0.00 0.00 48.26 0.00 12:33:08 12:07:01 2.88 0.00 2.88 0.00 0.00 47.19 0.00 12:33:08 12:08:01 1.95 0.00 1.95 0.00 0.00 36.66 0.00 12:33:08 12:09:01 2.37 0.00 2.37 0.00 0.00 43.99 0.00 12:33:08 12:10:01 1.72 0.00 1.72 
0.00 0.00 32.13 0.00 12:33:08 12:11:01 1.80 0.00 1.80 0.00 0.00 36.13 0.00 12:33:08 12:12:01 2.37 0.00 2.37 0.00 0.00 49.06 0.00 12:33:08 12:13:01 2.65 0.00 2.65 0.00 0.00 35.46 0.00 12:33:08 12:14:01 1.33 0.00 1.33 0.00 0.00 18.80 0.00 12:33:08 12:15:02 2.08 0.00 2.08 0.00 0.00 29.06 0.00 12:33:08 12:16:01 121.42 0.02 121.40 0.00 0.14 10188.55 0.00 12:33:08 12:17:01 12.38 8.53 3.85 0.00 458.72 348.08 0.00 12:33:08 12:18:01 26.25 0.00 26.25 0.00 0.00 758.41 0.00 12:33:08 12:19:01 57.09 0.00 57.09 0.00 0.00 824.13 0.00 12:33:08 12:20:01 2.28 0.00 2.28 0.00 0.00 38.66 0.00 12:33:08 12:21:01 2.25 0.00 2.25 0.00 0.00 38.26 0.00 12:33:08 12:22:01 2.00 0.00 2.00 0.00 0.00 38.39 0.00 12:33:08 12:23:01 2.47 0.00 2.47 0.00 0.00 45.86 0.00 12:33:08 12:24:01 1.57 0.00 1.57 0.00 0.00 37.59 0.00 12:33:08 12:25:01 16.21 0.00 16.21 0.00 0.00 370.20 0.00 12:33:08 12:26:01 59.84 0.00 59.84 0.00 0.00 964.37 0.00 12:33:08 12:27:01 3.03 0.00 3.03 0.00 0.00 170.10 0.00 12:33:08 12:28:01 1.58 0.00 1.58 0.00 0.00 48.66 0.00 12:33:08 12:29:01 3.35 0.00 3.35 0.00 0.00 83.72 0.00 12:33:08 12:30:01 1.67 0.00 1.67 0.00 0.00 22.80 0.00 12:33:08 12:31:01 1.83 0.00 1.83 0.00 0.00 24.26 0.00 12:33:08 12:32:01 1.37 0.00 1.37 0.00 0.00 17.20 0.00 12:33:08 12:33:01 54.97 10.48 44.49 0.00 323.15 6639.03 0.00 12:33:08 Average: 39.26 1.53 37.73 0.00 113.37 4019.29 0.00 12:33:08 12:33:08 10:57:02 kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 12:33:08 10:58:01 13242764 15421644 550764 3.36 75712 2291492 1281832 7.35 816684 2029052 186168 12:33:08 10:59:01 10525564 14351620 1606744 9.81 137832 3754468 2321052 13.32 2079428 3346528 662492 12:33:08 11:00:01 9363944 14896352 1061188 6.48 162080 5361156 2197896 12.61 1844332 4667764 1376376 12:33:08 11:01:01 5316380 13541600 2412760 14.73 201160 7894860 3460356 19.85 4150824 6280860 938224 12:33:08 11:02:01 3867552 13114904 2829848 17.27 225824 8855408 4259948 24.44 5016948 6808116 160232 12:33:08 11:03:01 166924 8945920 6996908 42.71 225272 8397964 8220924 47.17 9060492 6461316 192 12:33:08 11:04:01 443904 9097400 6844400 41.78 228304 8271432 8128052 46.63 8935356 6311448 1564 12:33:08 11:05:01 5447988 13983308 1961124 11.97 233052 8149860 2774988 15.92 4063860 6198944 992 12:33:08 11:06:01 1873384 10653568 5289180 32.29 242144 8375528 7105040 40.76 7442080 6373796 1480 12:33:08 11:07:01 1548528 10332484 5609732 34.24 245212 8376116 6497264 37.28 7771412 6366132 224 12:33:08 11:08:01 1899144 10686416 5256172 32.09 247208 8377316 6582500 37.77 7451376 6335192 732 12:33:08 11:09:01 169536 8138376 7802796 47.63 246908 7570764 8871976 50.90 9907012 5612168 172 12:33:08 11:10:01 2089620 9886040 6056104 36.97 247376 7400760 7919160 45.43 8171972 5436076 1460 12:33:08 11:11:01 1779980 9580448 6361548 38.83 250516 7401636 7293348 41.84 8482232 5435924 492 12:33:08 11:12:01 2536364 10338576 5603548 34.21 250828 7402944 6746652 38.71 7729076 5433840 544 12:33:08 11:13:01 173896 7806844 8134016 49.65 251564 7235844 9063916 52.00 10234084 5285520 400 12:33:08 11:14:01 166912 7661816 8279068 50.54 251580 7099804 9112148 52.28 10367184 5161824 320 12:33:08 11:15:01 5319248 13053364 2890888 17.65 258592 7325256 3681520 21.12 5057380 5335060 206692 12:33:08 11:16:01 3012180 10749220 5193284 31.70 260292 7326376 6526104 37.44 7379436 5308200 376 12:33:08 11:17:01 2976336 10713536 5229068 31.92 260304 7326508 6558180 37.63 7415112 5308044 388 12:33:08 11:18:01 4602928 12340272 3603004 21.99 260320 7326612 5294296 30.37 5806144 5299368 200 12:33:08 
11:19:01 2309372 10047700 5894460 35.98 261044 7326840 7266296 41.69 8095560 5296924 72 12:33:08 11:20:01 2272960 10011536 5930612 36.20 261056 7327076 7282320 41.78 8130488 5297156 120 12:33:08 11:21:01 3080232 10820284 5122216 31.27 262260 7327288 6494260 37.26 7324472 5296448 224 12:33:08 11:22:01 5111168 12919292 3024460 18.46 264544 7389076 4236632 24.31 5240396 5356120 44852 12:33:08 11:23:01 6479652 14288664 1655876 10.11 265036 7389440 2796892 16.05 3881992 5351380 532 12:33:08 11:24:01 5388364 13198012 2746156 16.76 265512 7389568 3631648 20.84 4968656 5351504 72 12:33:08 11:25:01 5388012 13197668 2746528 16.77 265516 7389572 3631648 20.84 4968956 5351508 56 12:33:08 11:26:01 5388044 13197708 2746476 16.77 265524 7389572 3631648 20.84 4968796 5351508 36 12:33:08 11:27:01 4902716 12712688 3231244 19.73 265560 7389840 4862108 27.90 5460748 5343344 424 12:33:08 11:28:01 3900684 11711424 4232080 25.83 266040 7390120 5103876 29.28 6457972 5343624 96 12:33:08 11:29:01 6353240 14164224 1780448 10.87 266104 7390300 2915960 16.73 4017404 5342756 600 12:33:08 11:30:01 6419880 14231832 1712792 10.46 266712 7390628 2860520 16.41 3950216 5342576 576 12:33:08 11:31:01 5958148 13770868 2173612 13.27 267272 7390836 2975424 17.07 4409548 5342780 72 12:33:08 11:32:01 5521476 13334504 2609816 15.93 267292 7391120 3682356 21.13 4845752 5342772 556 12:33:08 11:33:01 5756192 13570200 2374252 14.49 267756 7391628 3176176 18.22 4610800 5343244 84 12:33:08 11:34:01 5987460 13801724 2142608 13.08 267768 7391868 3515340 20.17 4379380 5343484 480 12:33:08 11:35:01 5193212 13008152 2935776 17.92 268140 7392160 3724428 21.37 5169860 5343776 220 12:33:08 11:36:01 4236792 12052532 3890996 23.75 268680 7392392 5048468 28.96 6123436 5343924 252 12:33:08 11:37:01 3686140 11502312 4440828 27.11 268680 7392808 5216644 29.93 6670836 5344320 464 12:33:08 11:38:01 3658144 11474616 4468516 27.28 268692 7393096 5248764 30.11 6698100 5344608 232 12:33:08 11:39:01 4377308 12194116 3749516 22.89 268856 7393248 4933452 28.30 5983480 5344660 212 12:33:08 11:40:01 3749304 11566664 4376428 26.72 268856 7393800 5148008 29.54 6607124 5345208 356 12:33:08 11:41:01 3731964 11549496 4393580 26.82 268856 7393984 5148008 29.54 6624028 5345380 284 12:33:08 11:42:01 3709488 11527128 4415944 26.96 268856 7394080 5148008 29.54 6646304 5345488 292 12:33:08 11:43:01 3692368 11510268 4432896 27.06 268856 7394336 5164040 29.63 6662576 5345744 532 12:33:08 11:44:01 3679012 11497296 4445856 27.14 268860 7394716 5180812 29.72 6675212 5346124 328 12:33:08 11:45:01 3648056 11466536 4476604 27.33 268860 7394924 5197008 29.82 6706680 5346316 420 12:33:08 11:46:01 3614184 11432924 4510104 27.53 268860 7395176 5213348 29.91 6739340 5346576 612 12:33:08 11:47:01 4659228 12477908 3465840 21.16 268876 7395080 4955784 28.43 5700328 5346380 140 12:33:08 11:48:01 3635272 11455044 4487984 27.40 269084 7395964 5301280 30.41 6718308 5347252 356 12:33:08 11:49:01 3617216 11437288 4505732 27.51 269084 7396268 5317324 30.51 6735300 5347552 44 12:33:08 11:50:01 3577652 11398328 4544680 27.74 269084 7396888 5333356 30.60 6774320 5348156 100 12:33:08 11:51:01 3557404 11378372 4564644 27.86 269084 7397164 5333356 30.60 6793036 5348448 128 12:33:08 11:52:01 3530820 11352308 4590692 28.02 269084 7397680 5333356 30.60 6820132 5348968 140 12:33:08 11:53:01 3489152 11311308 4631628 28.27 269084 7398348 5349372 30.69 6860724 5349636 456 12:33:08 11:54:01 7167680 14990336 954800 5.83 269092 7398840 1750840 10.05 3197480 5350096 464 12:33:08 11:55:01 2676188 10498732 5443996 
33.23 269308 7398512 6460444 37.07 7675232 5349744 496 12:33:08 11:56:01 2368852 10191732 5750696 35.11 269316 7398840 6578172 37.74 7980900 5350068 36 12:33:08 11:57:01 2173832 9997316 5944904 36.29 269324 7399436 6676240 38.30 8174536 5350664 448 12:33:08 11:58:01 2173084 9996704 5945508 36.29 269328 7399568 6676240 38.30 8174864 5350796 60 12:33:08 11:59:01 2158688 9982476 5959724 36.38 269332 7399732 6692264 38.40 8188724 5350960 68 12:33:08 12:00:01 2134456 9958620 5983568 36.53 269340 7400100 6708272 38.49 8212336 5351328 164 12:33:08 12:01:01 2106380 9930956 6011184 36.70 269344 7400520 6724388 38.58 8239476 5351736 36 12:33:08 12:02:01 2077920 9902664 6039484 36.87 269344 7400676 6740648 38.67 8267512 5351904 124 12:33:08 12:03:01 3010988 10835860 5106956 31.18 269648 7400472 6422820 36.85 7335860 5351632 496 12:33:08 12:04:01 2397116 10222384 5720132 34.92 269648 7400864 6589396 37.81 7950644 5352012 268 12:33:08 12:05:01 2163080 9989064 5953084 36.34 269652 7401576 6738136 38.66 8182572 5352724 236 12:33:08 12:06:01 2149220 9975412 5966720 36.42 269652 7401784 6738136 38.66 8196764 5352932 304 12:33:08 12:07:01 2141164 9967460 5974672 36.47 269652 7401888 6738136 38.66 8202996 5353036 152 12:33:08 12:08:01 2124012 9950620 5991460 36.57 269656 7402196 6738136 38.66 8220384 5353344 344 12:33:08 12:09:01 2118064 9944884 5997184 36.61 269656 7402412 6738136 38.66 8226000 5353556 388 12:33:08 12:10:01 2112088 9939052 6003016 36.65 269660 7402548 6754368 38.75 8231700 5353696 108 12:33:08 12:11:01 2069020 9896272 6045712 36.91 269660 7402848 6804104 39.04 8272196 5353984 652 12:33:08 12:12:01 2041504 9868988 6073104 37.07 269660 7403068 6804104 39.04 8301492 5354216 184 12:33:08 12:13:01 2041160 9868652 6073460 37.08 269660 7403076 6820108 39.13 8300728 5354224 96 12:33:08 12:14:01 2040972 9868468 6073644 37.08 269664 7403076 6820108 39.13 8300868 5354224 80 12:33:08 12:15:02 2035324 9862864 6079220 37.11 269664 7403120 6820108 39.13 8307008 5354264 192 12:33:08 12:16:01 3654040 11721836 4221136 25.77 275980 7628800 5142712 29.51 6515404 5524180 2956 12:33:08 12:17:01 3280172 11369972 4570424 27.90 276416 7649116 5432564 31.17 6866824 5539264 1816 12:33:08 12:18:01 6022136 14112432 1829436 11.17 276444 7649488 2897944 16.63 4154896 5521040 1032 12:33:08 12:19:01 3375788 11466508 4473900 27.31 276608 7649696 5257316 30.16 6793512 5517804 116 12:33:08 12:20:01 3359316 11450200 4490204 27.41 276612 7649856 5273324 30.25 6810460 5517812 76 12:33:08 12:21:01 3337464 11428636 4511772 27.54 276624 7650132 5273324 30.25 6831816 5518076 344 12:33:08 12:22:01 3331156 11422492 4517948 27.58 276632 7650288 5273324 30.25 6837432 5518232 200 12:33:08 12:23:01 3324864 11416504 4523912 27.62 276632 7650588 5305684 30.44 6842380 5518536 196 12:33:08 12:24:01 3304720 11396680 4543740 27.74 276632 7650908 5305684 30.44 6863144 5518836 96 12:33:08 12:25:01 6671580 14764084 1178488 7.19 276644 7651328 2002580 11.49 3522476 5508284 416 12:33:08 12:26:01 2633768 10726252 5213844 31.83 276748 7651160 6033180 34.61 7546652 5508024 512 12:33:08 12:27:01 2508184 10601240 5338680 32.59 276748 7651720 6119256 35.11 7670180 5508440 468 12:33:08 12:28:01 2440704 10534544 5405356 33.00 276752 7652500 6119256 35.11 7736260 5509208 740 12:33:08 12:29:01 2402564 10497064 5442780 33.23 276756 7653144 6135244 35.20 7774596 5509856 128 12:33:08 12:30:01 2402164 10496676 5443264 33.23 276756 7653152 6135244 35.20 7775244 5509868 60 12:33:08 12:31:01 2401440 10495960 5443932 33.23 276756 7653160 6135244 35.20 7775136 5509876 
72 12:33:08 12:32:01 2400984 10495504 5444388 33.24 276756 7653160 6135244 35.20 7775244 5509876 64 12:33:08 12:33:01 7026232 15320732 622080 3.80 281828 7834688 1390744 7.98 2998012 5666108 19592 12:33:08 Average: 3616057 11429390 4513870 27.55 260720 7404032 5449544 31.27 6685756 5405784 37796 12:33:08 12:33:08 10:57:02 IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 12:33:08 10:58:01 ens3 94.03 73.17 924.86 10.03 0.00 0.00 0.00 0.00 12:33:08 10:58:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 10:58:01 lo 0.88 0.88 0.09 0.09 0.00 0.00 0.00 0.00 12:33:08 10:59:01 ens3 517.97 416.82 7649.65 44.99 0.00 0.00 0.00 0.00 12:33:08 10:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 10:59:01 lo 6.40 6.40 0.65 0.65 0.00 0.00 0.00 0.00 12:33:08 11:00:01 ens3 361.07 263.61 5442.29 26.08 0.00 0.00 0.00 0.00 12:33:08 11:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:00:01 lo 0.80 0.80 0.08 0.08 0.00 0.00 0.00 0.00 12:33:08 11:01:01 ens3 67.71 37.84 1783.03 6.17 0.00 0.00 0.00 0.00 12:33:08 11:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:01:01 lo 1.07 1.07 0.10 0.10 0.00 0.00 0.00 0.00 12:33:08 11:02:01 ens3 182.19 125.59 2104.01 8.82 0.00 0.00 0.00 0.00 12:33:08 11:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:02:01 lo 4.80 4.80 0.49 0.49 0.00 0.00 0.00 0.00 12:33:08 11:03:01 ens3 1.57 1.52 0.24 0.24 0.00 0.00 0.00 0.00 12:33:08 11:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:03:01 lo 21.81 21.81 27.46 27.46 0.00 0.00 0.00 0.00 12:33:08 11:04:01 ens3 1.50 1.67 0.26 0.29 0.00 0.00 0.00 0.00 12:33:08 11:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:04:01 lo 38.14 38.14 27.12 27.12 0.00 0.00 0.00 0.00 12:33:08 11:05:01 ens3 1.20 1.35 0.20 0.20 0.00 0.00 0.00 0.00 12:33:08 11:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:05:01 lo 35.64 35.64 17.33 17.33 0.00 0.00 0.00 0.00 12:33:08 11:06:01 ens3 2.13 2.52 0.82 0.75 0.00 0.00 0.00 0.00 12:33:08 11:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:06:01 lo 5.55 5.55 1.15 1.15 0.00 0.00 0.00 0.00 12:33:08 11:07:01 ens3 1.60 2.07 0.83 0.36 0.00 0.00 0.00 0.00 12:33:08 11:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:07:01 lo 29.58 29.58 28.57 28.57 0.00 0.00 0.00 0.00 12:33:08 11:08:01 ens3 1.20 1.10 0.17 0.16 0.00 0.00 0.00 0.00 12:33:08 11:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:08:01 lo 40.91 40.91 17.73 17.73 0.00 0.00 0.00 0.00 12:33:08 11:09:01 ens3 1.13 1.32 0.19 0.19 0.00 0.00 0.00 0.00 12:33:08 11:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:09:01 lo 29.30 29.30 13.38 13.38 0.00 0.00 0.00 0.00 12:33:08 11:10:01 ens3 1.62 2.00 0.31 0.32 0.00 0.00 0.00 0.00 12:33:08 11:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:10:01 lo 37.97 37.97 15.05 15.05 0.00 0.00 0.00 0.00 12:33:08 11:11:01 ens3 1.32 1.32 0.17 0.17 0.00 0.00 0.00 0.00 12:33:08 11:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:11:01 lo 24.56 24.56 11.47 11.47 0.00 0.00 0.00 0.00 12:33:08 11:12:01 ens3 1.58 1.98 0.36 0.36 0.00 0.00 0.00 0.00 12:33:08 11:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:12:01 lo 37.21 37.21 11.55 11.55 0.00 0.00 0.00 0.00 12:33:08 11:13:01 ens3 1.87 2.40 0.36 0.37 0.00 0.00 0.00 0.00 12:33:08 11:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:13:01 lo 42.78 42.78 22.09 22.09 0.00 0.00 0.00 0.00 12:33:08 11:14:01 ens3 1.33 1.72 0.26 0.27 0.00 
0.00 0.00 0.00 12:33:08 11:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:14:01 lo 60.67 60.67 21.43 21.43 0.00 0.00 0.00 0.00 12:33:08 11:15:01 ens3 2.33 2.98 0.92 0.85 0.00 0.00 0.00 0.00 12:33:08 11:15:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:15:01 lo 20.18 20.18 5.88 5.88 0.00 0.00 0.00 0.00 12:33:08 11:16:01 ens3 2.48 3.05 0.55 0.48 0.00 0.00 0.00 0.00 12:33:08 11:16:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:16:01 lo 20.85 20.85 20.98 20.98 0.00 0.00 0.00 0.00 12:33:08 11:17:01 ens3 1.37 1.68 0.31 0.31 0.00 0.00 0.00 0.00 12:33:08 11:17:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:17:01 lo 24.90 24.90 8.23 8.23 0.00 0.00 0.00 0.00 12:33:08 11:18:01 ens3 1.23 1.27 0.24 0.24 0.00 0.00 0.00 0.00 12:33:08 11:18:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:18:01 lo 23.51 23.51 7.45 7.45 0.00 0.00 0.00 0.00 12:33:08 11:19:01 ens3 0.88 0.75 0.14 0.13 0.00 0.00 0.00 0.00 12:33:08 11:19:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:19:01 lo 18.05 18.05 11.36 11.36 0.00 0.00 0.00 0.00 12:33:08 11:20:01 ens3 1.08 0.97 0.22 0.20 0.00 0.00 0.00 0.00 12:33:08 11:20:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:20:01 lo 42.18 42.18 13.95 13.95 0.00 0.00 0.00 0.00 12:33:08 11:21:01 ens3 0.95 0.83 0.16 0.15 0.00 0.00 0.00 0.00 12:33:08 11:21:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:21:01 lo 26.58 26.58 11.44 11.44 0.00 0.00 0.00 0.00 12:33:08 11:22:01 ens3 37.53 32.13 8.36 23.03 0.00 0.00 0.00 0.00 12:33:08 11:22:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:22:01 lo 4.03 4.03 0.53 0.53 0.00 0.00 0.00 0.00 12:33:08 11:23:01 ens3 1.55 1.48 0.31 0.29 0.00 0.00 0.00 0.00 12:33:08 11:23:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:23:01 lo 15.88 15.88 10.16 10.16 0.00 0.00 0.00 0.00 12:33:08 11:24:01 ens3 0.35 0.25 0.05 0.04 0.00 0.00 0.00 0.00 12:33:08 11:24:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:24:01 lo 11.81 11.81 4.52 4.52 0.00 0.00 0.00 0.00 12:33:08 11:25:01 ens3 1.78 0.40 0.51 0.21 0.00 0.00 0.00 0.00 12:33:08 11:25:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:25:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 12:33:08 11:26:01 ens3 0.57 0.23 0.27 0.20 0.00 0.00 0.00 0.00 12:33:08 11:26:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:26:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:27:01 ens3 0.77 0.60 0.15 0.13 0.00 0.00 0.00 0.00 12:33:08 11:27:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:27:01 lo 1.70 1.70 0.15 0.15 0.00 0.00 0.00 0.00 12:33:08 11:28:01 ens3 1.03 0.83 0.24 0.17 0.00 0.00 0.00 0.00 12:33:08 11:28:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:28:01 lo 20.48 20.48 10.08 10.08 0.00 0.00 0.00 0.00 12:33:08 11:29:01 ens3 1.08 1.00 0.20 0.19 0.00 0.00 0.00 0.00 12:33:08 11:29:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:29:01 lo 11.13 11.13 3.96 3.96 0.00 0.00 0.00 0.00 12:33:08 11:30:01 ens3 1.20 0.87 0.39 0.32 0.00 0.00 0.00 0.00 12:33:08 11:30:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:30:01 lo 5.35 5.35 6.26 6.26 0.00 0.00 0.00 0.00 12:33:08 11:31:01 ens3 1.12 0.48 0.18 0.10 0.00 0.00 0.00 0.00 12:33:08 11:31:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:31:01 lo 8.48 8.48 3.16 3.16 0.00 0.00 0.00 0.00 12:33:08 11:32:01 ens3 1.70 0.85 0.50 0.36 0.00 0.00 0.00 0.00 12:33:08 11:32:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 
11:32:01 lo 3.53 3.53 0.53 0.53 0.00 0.00 0.00 0.00 12:33:08 11:33:01 ens3 2.80 1.47 0.67 0.40 0.00 0.00 0.00 0.00 12:33:08 11:33:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:33:01 lo 33.67 33.67 16.43 16.43 0.00 0.00 0.00 0.00 12:33:08 11:34:01 ens3 1.88 1.42 0.71 0.57 0.00 0.00 0.00 0.00 12:33:08 11:34:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:34:01 lo 17.65 17.65 6.90 6.90 0.00 0.00 0.00 0.00 12:33:08 11:35:01 ens3 0.62 0.52 0.11 0.10 0.00 0.00 0.00 0.00 12:33:08 11:35:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:35:01 lo 20.33 20.33 20.14 20.14 0.00 0.00 0.00 0.00 12:33:08 11:36:01 ens3 0.92 0.75 0.14 0.12 0.00 0.00 0.00 0.00 12:33:08 11:36:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:36:01 lo 5.27 5.27 1.74 1.74 0.00 0.00 0.00 0.00 12:33:08 11:37:01 ens3 0.95 0.73 0.22 0.19 0.00 0.00 0.00 0.00 12:33:08 11:37:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:37:01 lo 45.02 45.02 17.29 17.29 0.00 0.00 0.00 0.00 12:33:08 11:38:01 ens3 0.63 0.57 0.13 0.12 0.00 0.00 0.00 0.00 12:33:08 11:38:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:38:01 lo 42.31 42.31 12.22 12.22 0.00 0.00 0.00 0.00 12:33:08 11:39:01 ens3 0.78 0.70 0.11 0.10 0.00 0.00 0.00 0.00 12:33:08 11:39:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:39:01 lo 6.47 6.47 1.68 1.68 0.00 0.00 0.00 0.00 12:33:08 11:40:01 ens3 0.78 0.60 0.14 0.12 0.00 0.00 0.00 0.00 12:33:08 11:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:40:01 lo 30.96 30.96 22.33 22.33 0.00 0.00 0.00 0.00 12:33:08 11:41:01 ens3 0.62 0.47 0.15 0.09 0.00 0.00 0.00 0.00 12:33:08 11:41:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:41:01 lo 10.43 10.43 6.73 6.73 0.00 0.00 0.00 0.00 12:33:08 11:42:01 ens3 3.52 5.53 0.45 7.50 0.00 0.00 0.00 0.00 12:33:08 11:42:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:42:01 lo 14.51 14.51 7.18 7.18 0.00 0.00 0.00 0.00 12:33:08 11:43:01 ens3 0.58 0.42 0.09 0.08 0.00 0.00 0.00 0.00 12:33:08 11:43:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:43:01 lo 23.23 23.23 8.67 8.67 0.00 0.00 0.00 0.00 12:33:08 11:44:01 ens3 0.72 0.37 0.12 0.07 0.00 0.00 0.00 0.00 12:33:08 11:44:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:44:01 lo 17.30 17.30 7.97 7.97 0.00 0.00 0.00 0.00 12:33:08 11:45:01 ens3 2.72 2.22 0.78 2.78 0.00 0.00 0.00 0.00 12:33:08 11:45:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:45:01 lo 19.38 19.38 8.05 8.05 0.00 0.00 0.00 0.00 12:33:08 11:46:01 ens3 0.60 0.58 0.09 0.10 0.00 0.00 0.00 0.00 12:33:08 11:46:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:46:01 lo 23.55 23.55 9.01 9.01 0.00 0.00 0.00 0.00 12:33:08 11:47:01 ens3 1.62 1.02 0.26 0.19 0.00 0.00 0.00 0.00 12:33:08 11:47:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:47:01 lo 5.00 5.00 1.46 1.46 0.00 0.00 0.00 0.00 12:33:08 11:48:01 ens3 1.05 1.28 0.41 0.37 0.00 0.00 0.00 0.00 12:33:08 11:48:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:48:01 lo 49.86 49.86 18.66 18.66 0.00 0.00 0.00 0.00 12:33:08 11:49:01 ens3 0.45 0.48 0.07 0.07 0.00 0.00 0.00 0.00 12:33:08 11:49:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:49:01 lo 26.13 26.13 7.43 7.43 0.00 0.00 0.00 0.00 12:33:08 11:50:01 ens3 0.62 0.60 0.10 0.10 0.00 0.00 0.00 0.00 12:33:08 11:50:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:50:01 lo 39.51 39.51 12.58 12.58 0.00 0.00 0.00 0.00 12:33:08 11:51:01 ens3 0.50 0.50 0.07 
0.07 0.00 0.00 0.00 0.00 12:33:08 11:51:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:51:01 lo 25.11 25.11 7.19 7.19 0.00 0.00 0.00 0.00 12:33:08 11:52:01 ens3 0.32 0.23 0.09 0.08 0.00 0.00 0.00 0.00 12:33:08 11:52:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:52:01 lo 29.19 29.19 8.50 8.50 0.00 0.00 0.00 0.00 12:33:08 11:53:01 ens3 0.28 0.13 0.02 0.01 0.00 0.00 0.00 0.00 12:33:08 11:53:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:53:01 lo 53.16 53.16 15.97 15.97 0.00 0.00 0.00 0.00 12:33:08 11:54:01 ens3 0.42 0.43 0.07 0.07 0.00 0.00 0.00 0.00 12:33:08 11:54:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:54:01 lo 42.08 42.08 12.82 12.82 0.00 0.00 0.00 0.00 12:33:08 11:55:01 ens3 0.73 0.73 0.09 0.09 0.00 0.00 0.00 0.00 12:33:08 11:55:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:55:01 lo 12.46 12.46 18.17 18.17 0.00 0.00 0.00 0.00 12:33:08 11:56:01 ens3 0.75 0.92 0.14 0.15 0.00 0.00 0.00 0.00 12:33:08 11:56:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:56:01 lo 29.60 29.60 11.26 11.26 0.00 0.00 0.00 0.00 12:33:08 11:57:01 ens3 0.90 1.00 0.36 0.19 0.00 0.00 0.00 0.00 12:33:08 11:57:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:57:01 lo 31.11 31.11 14.42 14.42 0.00 0.00 0.00 0.00 12:33:08 11:58:01 ens3 0.58 0.62 0.10 0.10 0.00 0.00 0.00 0.00 12:33:08 11:58:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:58:01 lo 9.35 9.35 4.21 4.21 0.00 0.00 0.00 0.00 12:33:08 11:59:01 ens3 1.13 0.98 0.19 0.15 0.00 0.00 0.00 0.00 12:33:08 11:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 11:59:01 lo 24.15 24.15 9.75 9.75 0.00 0.00 0.00 0.00 12:33:08 12:00:01 ens3 2.28 0.80 0.77 0.50 0.00 0.00 0.00 0.00 12:33:08 12:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:00:01 lo 32.13 32.13 10.63 10.63 0.00 0.00 0.00 0.00 12:33:08 12:01:01 ens3 1.25 0.78 0.65 0.50 0.00 0.00 0.00 0.00 12:33:08 12:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:01:01 lo 21.88 21.88 9.72 9.72 0.00 0.00 0.00 0.00 12:33:08 12:02:01 ens3 0.85 0.65 0.20 0.18 0.00 0.00 0.00 0.00 12:33:08 12:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:02:01 lo 19.00 19.00 8.04 8.04 0.00 0.00 0.00 0.00 12:33:08 12:03:01 ens3 0.97 0.72 0.15 0.09 0.00 0.00 0.00 0.00 12:33:08 12:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:03:01 lo 4.20 4.20 5.42 5.42 0.00 0.00 0.00 0.00 12:33:08 12:04:01 ens3 1.40 0.67 0.21 0.14 0.00 0.00 0.00 0.00 12:33:08 12:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:04:01 lo 23.80 23.80 18.16 18.16 0.00 0.00 0.00 0.00 12:33:08 12:05:01 ens3 1.40 0.75 0.44 0.31 0.00 0.00 0.00 0.00 12:33:08 12:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:05:01 lo 43.48 43.48 19.70 19.70 0.00 0.00 0.00 0.00 12:33:08 12:06:01 ens3 2.00 0.92 0.74 0.51 0.00 0.00 0.00 0.00 12:33:08 12:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:06:01 lo 11.45 11.45 8.02 8.02 0.00 0.00 0.00 0.00 12:33:08 12:07:01 ens3 3.05 1.15 1.15 0.78 0.00 0.00 0.00 0.00 12:33:08 12:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:07:01 lo 14.53 14.53 6.53 6.53 0.00 0.00 0.00 0.00 12:33:08 12:08:01 ens3 1.07 0.88 0.42 0.35 0.00 0.00 0.00 0.00 12:33:08 12:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:08:01 lo 16.15 16.15 10.37 10.37 0.00 0.00 0.00 0.00 12:33:08 12:09:01 ens3 0.70 0.43 0.09 0.09 0.00 0.00 0.00 0.00 12:33:08 12:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 
0.00 0.00 12:33:08 12:09:01 lo 11.21 11.21 5.93 5.93 0.00 0.00 0.00 0.00 12:33:08 12:10:01 ens3 0.57 0.47 0.11 0.09 0.00 0.00 0.00 0.00 12:33:08 12:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:10:01 lo 19.83 19.83 9.85 9.85 0.00 0.00 0.00 0.00 12:33:08 12:11:01 ens3 0.60 0.28 0.11 0.04 0.00 0.00 0.00 0.00 12:33:08 12:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:11:01 lo 25.61 25.61 11.70 11.70 0.00 0.00 0.00 0.00 12:33:08 12:12:01 ens3 0.50 0.37 0.14 0.12 0.00 0.00 0.00 0.00 12:33:08 12:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:12:01 lo 19.75 19.75 7.49 7.49 0.00 0.00 0.00 0.00 12:33:08 12:13:01 ens3 0.20 0.07 0.01 0.01 0.00 0.00 0.00 0.00 12:33:08 12:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:13:01 lo 0.83 0.83 0.07 0.07 0.00 0.00 0.00 0.00 12:33:08 12:14:01 ens3 0.13 0.00 0.01 0.00 0.00 0.00 0.00 0.00 12:33:08 12:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:14:01 lo 0.83 0.83 0.10 0.10 0.00 0.00 0.00 0.00 12:33:08 12:15:02 ens3 0.90 0.27 0.11 0.04 0.00 0.00 0.00 0.00 12:33:08 12:15:02 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:15:02 lo 1.78 1.78 0.16 0.16 0.00 0.00 0.00 0.00 12:33:08 12:16:01 ens3 2.20 2.32 1.11 0.96 0.00 0.00 0.00 0.00 12:33:08 12:16:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:16:01 lo 15.79 15.79 26.00 26.00 0.00 0.00 0.00 0.00 12:33:08 12:17:01 ens3 3.83 3.10 17.12 0.43 0.00 0.00 0.00 0.00 12:33:08 12:17:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:17:01 lo 35.31 35.31 14.14 14.14 0.00 0.00 0.00 0.00 12:33:08 12:18:01 ens3 1.72 1.02 0.69 0.52 0.00 0.00 0.00 0.00 12:33:08 12:18:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:18:01 lo 24.53 24.53 8.09 8.09 0.00 0.00 0.00 0.00 12:33:08 12:19:01 ens3 0.78 0.65 0.12 0.11 0.00 0.00 0.00 0.00 12:33:08 12:19:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:19:01 lo 33.03 33.03 17.38 17.38 0.00 0.00 0.00 0.00 12:33:08 12:20:01 ens3 0.90 0.72 0.16 0.14 0.00 0.00 0.00 0.00 12:33:08 12:20:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:20:01 lo 11.53 11.53 4.64 4.64 0.00 0.00 0.00 0.00 12:33:08 12:21:01 ens3 0.83 0.67 0.24 0.12 0.00 0.00 0.00 0.00 12:33:08 12:21:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:21:01 lo 26.60 26.60 10.38 10.38 0.00 0.00 0.00 0.00 12:33:08 12:22:01 ens3 1.03 0.83 0.24 0.22 0.00 0.00 0.00 0.00 12:33:08 12:22:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:22:01 lo 17.93 17.93 7.54 7.54 0.00 0.00 0.00 0.00 12:33:08 12:23:01 ens3 0.42 0.32 0.11 0.05 0.00 0.00 0.00 0.00 12:33:08 12:23:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:23:01 lo 27.76 27.76 8.64 8.64 0.00 0.00 0.00 0.00 12:33:08 12:24:01 ens3 0.67 0.48 0.12 0.10 0.00 0.00 0.00 0.00 12:33:08 12:24:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:24:01 lo 20.41 20.41 7.89 7.89 0.00 0.00 0.00 0.00 12:33:08 12:25:01 ens3 1.08 0.97 0.20 0.19 0.00 0.00 0.00 0.00 12:33:08 12:25:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:25:01 lo 37.56 37.56 12.13 12.13 0.00 0.00 0.00 0.00 12:33:08 12:26:01 ens3 0.83 0.73 0.13 0.12 0.00 0.00 0.00 0.00 12:33:08 12:26:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:26:01 lo 39.89 39.89 20.44 20.44 0.00 0.00 0.00 0.00 12:33:08 12:27:01 ens3 0.98 0.88 0.24 0.22 0.00 0.00 0.00 0.00 12:33:08 12:27:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 12:33:08 12:27:01 lo 40.61 40.61 15.11 15.11 0.00 0.00 0.00 0.00 12:33:08 
12:28:01 ens3 0.77 0.10 0.09 0.02 0.00 0.00 0.00 0.00
12:33:08 12:28:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:28:01 lo 59.11 59.11 20.29 20.29 0.00 0.00 0.00 0.00
12:33:08 12:29:01 ens3 1.05 0.80 0.37 0.31 0.00 0.00 0.00 0.00
12:33:08 12:29:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:29:01 lo 74.70 74.70 24.72 24.72 0.00 0.00 0.00 0.00
12:33:08 12:30:01 ens3 0.78 0.27 0.30 0.20 0.00 0.00 0.00 0.00
12:33:08 12:30:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:30:01 lo 0.65 0.65 0.06 0.06 0.00 0.00 0.00 0.00
12:33:08 12:31:01 ens3 0.68 0.33 0.37 0.24 0.00 0.00 0.00 0.00
12:33:08 12:31:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:31:01 lo 0.97 0.97 0.10 0.10 0.00 0.00 0.00 0.00
12:33:08 12:32:01 ens3 0.78 0.13 0.14 0.07 0.00 0.00 0.00 0.00
12:33:08 12:32:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:32:01 lo 0.52 0.52 0.04 0.04 0.00 0.00 0.00 0.00
12:33:08 12:33:01 ens3 129.98 104.68 1554.10 18.58 0.00 0.00 0.00 0.00
12:33:08 12:33:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 12:33:01 lo 2.18 2.18 0.21 0.21 0.00 0.00 0.00 0.00
12:33:08 Average: ens3 15.56 11.89 203.15 1.74 0.00 0.00 0.00 0.00
12:33:08 Average: docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
12:33:08 Average: lo 21.87 21.87 9.92 9.92 0.00 0.00 0.00 0.00
12:33:08
12:33:08
12:33:08 ---> sar -P ALL:
12:33:08 Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-2816) 10/30/24 _x86_64_ (4 CPU)
12:33:08
12:33:08 10:56:13 LINUX RESTART (4 CPU)
12:33:08
12:33:08 10:57:02 CPU %user %nice %system %iowait %steal %idle
12:33:08 Average: all 20.44 0.09 0.92 0.35 0.09 78.10
12:33:08 Average: 0 20.48 0.09 0.95 0.32 0.09 78.08
12:33:08 Average: 1 20.24 0.10 0.96 0.50 0.10 78.10
12:33:08 Average: 2 20.31 0.09 0.89 0.29 0.09 78.33
12:33:08 Average: 3 20.75 0.10 0.90 0.28 0.09 77.89
12:33:08
12:33:08
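When post-processing this sar -P ALL report, the "Average: all" row is consistent with the per-CPU rows: each overall figure is the arithmetic mean of the four per-CPU averages. A minimal sanity-check sketch (plain Python, not part of this job; values copied from the %idle column above):

# Check that the overall sar average equals the mean of the per-CPU averages.
per_cpu_idle = {0: 78.08, 1: 78.10, 2: 78.33, 3: 77.89}  # "Average: <cpu>" %idle rows
reported_all = 78.10                                      # "Average: all" %idle row

mean_idle = sum(per_cpu_idle.values()) / len(per_cpu_idle)
print(f"mean of per-CPU %idle = {mean_idle:.2f}")         # -> 78.10
assert abs(mean_idle - reported_all) < 0.01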