Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/113593
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-ubuntu2004-docker-4c-16g-25219 (ubuntu2004-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-scandium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-1n1rpiVsDHnQ/agent.13085
SSH_AGENT_PID=13087
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/transportpce-tox-verify-scandium@tmp/private_key_4255522343319533075.key (/w/workspace/transportpce-tox-verify-scandium@tmp/private_key_4255522343319533075.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
The recommended git tool is: NONE
using credential jenkins-ssh
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository git://devvexx.opendaylight.org/mirror/transportpce
> git init /w/workspace/transportpce-tox-verify-scandium # timeout=10
Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
> git --version # timeout=10
> git --version # 'git version 2.25.1'
using GIT_SSH to set credentials jenkins-ssh
Verifying host key using known hosts file
You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
> git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
using GIT_SSH to set credentials jenkins-ssh
Verifying host key using known hosts file
You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
> git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/93/113593/5 # timeout=10
> git rev-parse b0f8bdd27c3c7c9df46db3d67db62520a94d19ef^{commit} # timeout=10
Checking out Revision b0f8bdd27c3c7c9df46db3d67db62520a94d19ef (refs/changes/93/113593/5)
> git config core.sparsecheckout # timeout=10
> git checkout -f b0f8bdd27c3c7c9df46db3d67db62520a94d19ef # timeout=10
Commit message: "Bump netconf to 8.0.2"
> git rev-parse FETCH_HEAD^{commit} # timeout=10
> git rev-list --no-walk d6d346d568042c620ef10f0c8d49618b7bbb83b6 # timeout=10
> git remote # timeout=10
> git submodule init # timeout=10
> git submodule sync # timeout=10
> git config --get remote.origin.url # timeout=10
> git submodule init # timeout=10
> git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
ERROR: No submodules found.
provisioning config files...
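For reference, the fetch and checkout above can be reproduced outside Jenkins with the refspec and revision recorded in the log; a minimal sketch, assuming anonymous read access to the mirror (the Gerrit remote at https://git.opendaylight.org/gerrit/transportpce should work equally well):

    git clone git://devvexx.opendaylight.org/mirror/transportpce
    cd transportpce
    # fetch patch set 5 of change 113593, same refspec as the job
    git fetch origin refs/changes/93/113593/5
    # detach onto the revision the job verified
    git checkout -f b0f8bdd27c3c7c9df46db3d67db62520a94d19ef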
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins6359823424129299859.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.13 (set by /opt/pyenv/version)
* 3.9.13 (set by /opt/pyenv/version)
* 3.10.13 (set by /opt/pyenv/version)
* 3.11.7 (set by /opt/pyenv/version)
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-rCHz
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-rCHz/bin to PATH
Generating Requirements File
Python 3.11.7
pip 24.2 from /tmp/venv-rCHz/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.5.0
aspy.yaml==1.3.0
attrs==24.2.0
autopage==0.5.2
beautifulsoup4==4.12.3
boto3==1.35.24
botocore==1.35.24
bs4==0.0.2
cachetools==5.5.0
certifi==2024.8.30
cffi==1.17.1
cfgv==3.4.0
chardet==5.2.0
charset-normalizer==3.3.2
click==8.1.7
cliff==4.7.0
cmd2==2.4.3
cryptography==3.3.2
debtcollector==3.0.0
decorator==5.1.1
defusedxml==0.7.1
Deprecated==1.2.14
distlib==0.3.8
dnspython==2.6.1
docker==4.2.2
dogpile.cache==1.3.3
durationpy==0.7
email_validator==2.2.0
filelock==3.16.1
future==1.0.0
gitdb==4.0.11
GitPython==3.1.43
google-auth==2.35.0
httplib2==0.22.0
identify==2.6.1
idna==3.10
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.4
jmespath==1.0.1
jsonpatch==1.33
jsonpointer==3.0.0
jsonschema==4.23.0
jsonschema-specifications==2023.12.1
keystoneauth1==5.8.0
kubernetes==31.0.0
lftools==0.37.10
lxml==5.3.0
MarkupSafe==2.1.5
msgpack==1.1.0
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
netifaces==0.11.0
niet==1.4.2
nodeenv==1.9.1
oauth2client==4.1.3
oauthlib==3.2.2
openstacksdk==4.0.0
os-client-config==2.1.0
os-service-types==1.7.0
osc-lib==3.1.0
oslo.config==9.6.0
oslo.context==5.6.0
oslo.i18n==6.4.0
oslo.log==6.1.2
oslo.serialization==5.5.0
oslo.utils==7.3.0
packaging==24.1
pbr==6.1.0
platformdirs==4.3.6
prettytable==3.11.0
pyasn1==0.6.1
pyasn1_modules==0.4.1
pycparser==2.22
pygerrit2==2.0.15
PyGithub==2.4.0
PyJWT==2.9.0
PyNaCl==1.5.0
pyparsing==2.4.7
pyperclip==1.9.0
pyrsistent==0.20.0
python-cinderclient==9.6.0
python-dateutil==2.9.0.post0
python-heatclient==4.0.0
python-jenkins==1.8.2
python-keystoneclient==5.5.0
python-magnumclient==4.7.0
python-openstackclient==7.1.2
python-swiftclient==4.6.0
PyYAML==6.0.2
referencing==0.35.1
requests==2.32.3
requests-oauthlib==2.0.0
requestsexceptions==1.4.0
rfc3986==2.0.0
rpds-py==0.20.0
rsa==4.9
ruamel.yaml==0.18.6
ruamel.yaml.clib==0.2.8
s3transfer==0.10.2
simplejson==3.19.3
six==1.16.0
smmap==5.0.1
soupsieve==2.6
stevedore==5.3.0
tabulate==0.9.0
toml==0.10.2
tomlkit==0.13.2
tqdm==4.66.5
typing_extensions==4.12.2
tzdata==2024.1
urllib3==1.26.20
virtualenv==20.26.5
wcwidth==0.2.13
websocket-client==1.8.0
wrapt==1.16.0
xdg==6.0.0
xmltodict==0.13.0
yq==3.4.3
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
PYTHON=python3
[EnvInject] - Variables injected successfully.
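Stripped of the shell xtrace that follows, the tox bootstrap traced below (tox-install.sh calling lf-activate-venv) effectively amounts to the commands shown here, with /tmp/venv-OSkH standing for the mktemp-generated directory whose path is recorded in /tmp/.toxenv:

    python3 -m venv /tmp/venv-OSkH
    /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet pip virtualenv
    /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv 'urllib3~=1.26.15'
    export PATH=/tmp/venv-OSkH/bin:$PATH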
[transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins3102843901408866973.sh ---> tox-install.sh + source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-OSkH + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ awk '{ print $1 }' ++ sed 's/^[ *]* //' ++ grep -E '^[0-9.]*[0-9]$' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] + [[ ! -f /tmp/.toxenv ]] + [[ -n '' ]] + python3 -m venv /tmp/venv-OSkH + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-OSkH' lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-OSkH + echo /tmp/venv-OSkH + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv + /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-OSkH/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-OSkH/bin to PATH + PATH=/tmp/venv-OSkH/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + python3 --version Python 3.11.7 + python3 -m pip --version pip 24.2 from /tmp/venv-OSkH/lib/python3.11/site-packages/pip (python 3.11) + python3 -m pip freeze cachetools==5.5.0 chardet==5.2.0 colorama==0.4.6 distlib==0.3.8 filelock==3.16.1 packaging==24.1 platformdirs==4.3.6 pluggy==1.5.0 pyproject-api==1.8.0 tox==4.20.0 urllib3==1.26.20 virtualenv==20.26.5 [transportpce-tox-verify-scandium] $ /bin/sh -xe /tmp/jenkins3983515608907909598.sh [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PARALLEL=True [EnvInject] - Variables injected successfully. [transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins11588276321198972868.sh ---> tox-run.sh + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/tox + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/docs + mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/tox + cd /w/workspace/transportpce-tox-verify-scandium/. 
+ source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-BkNC + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ sed 's/^[ *]* //' ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ awk '{ print $1 }' ++ grep -E '^[0-9.]*[0-9]$' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] ++ cat /tmp/.toxenv + lf_venv=/tmp/venv-OSkH + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-OSkH from' file:/tmp/.toxenv lf-activate-venv(): INFO: Reuse venv:/tmp/venv-OSkH from file:/tmp/.toxenv + /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-OSkH/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-OSkH/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-OSkH/bin to PATH + PATH=/tmp/venv-OSkH/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + [[ -d /opt/pyenv ]] + echo '---> Setting up pyenv' ---> Setting up pyenv + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/tmp/venv-OSkH/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/tmp/venv-OSkH/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ pwd + PYTHONPATH=/w/workspace/transportpce-tox-verify-scandium + export PYTHONPATH + export TOX_TESTENV_PASSENV=PYTHONPATH + TOX_TESTENV_PASSENV=PYTHONPATH + tox --version 4.20.0 from /tmp/venv-OSkH/lib/python3.11/site-packages/tox/__init__.py + PARALLEL=True + TOX_OPTIONS_LIST= + [[ -n '' ]] + case ${PARALLEL,,} in + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' + tox --parallel auto --parallel-live + tee -a /w/workspace/transportpce-tox-verify-scandium/archives/tox/tox.log checkbashisms: freeze> python -m pip freeze --all buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt docs: install_deps> python -I -m pip install -r docs/requirements.txt docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt checkbashisms: pip==24.2,setuptools==75.1.0,wheel==0.44.0 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo 
"checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + script ./reflectwarn.sh does not appear to have a #! interpreter line; you may get strange results checkbashisms: OK ✔ in 2.73 seconds pre-commit: install_deps> python -I -m pip install pre-commit pre-commit: freeze> python -m pip freeze --all pre-commit: cfgv==3.4.0,distlib==0.3.8,filelock==3.16.1,identify==2.6.1,nodeenv==1.9.1,pip==24.2,platformdirs==4.3.6,pre-commit==3.8.0,PyYAML==6.0.2,setuptools==75.1.0,virtualenv==20.26.5,wheel==0.44.0 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh pre-commit: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' /usr/bin/cpan pre-commit: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run --all-files --show-diff-on-failure [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. buildcontroller: freeze> python -m pip freeze --all [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_controller.sh + update-java-alternatives -l java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. + + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; java -version [INFO] Initializing environment for https://github.com/perltidy/perltidy. + JAVA_VER=21 + echo 21 21 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; + javac -version 21 ok, java is 21 or newer + JAVAC_VER=21 + echo 21 + [ 21 -ge 21 ] + [ 21 -ge 21 ] + echo ok, java is 21 or newer + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp 2024-09-21 07:58:13 URL:https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz [9083702/9083702] -> "/tmp/apache-maven-3.9.8-bin.tar.gz" [1] + sudo mkdir -p /opt + sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt + sudo ln -s /opt/apache-maven-3.9.8 /opt/maven + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn + mvn --version [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. 
[INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256) Maven home: /opt/maven Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 Default locale: en, platform encoding: UTF-8 OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix" NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... docs-linkcheck: freeze> python -m pip freeze --all docs-linkcheck: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.2,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/linkcheck [INFO] Installing environment for https://github.com/perltidy/perltidy. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... /w/workspace/transportpce-tox-verify-scandium/.tox/docs-linkcheck/lib/python3.11/site-packages/sphinx/builders/linkcheck.py:86: RemovedInSphinx80Warning: The default value for 'linkcheck_report_timeouts_as_broken' will change to False in Sphinx 8, meaning that request timeouts will be reported with a new 'timeout' status, instead of as 'broken'. This is intended to provide more detail as to the failure mode. 
See https://github.com/sphinx-doc/sphinx/issues/11868 for details. warnings.warn(deprecation_msg, RemovedInSphinx80Warning, stacklevel=1) docs: freeze> python -m pip freeze --all docs: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.2,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/html docs: OK ✔ in 31.4 seconds pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' docs-linkcheck: OK ✔ in 32.45 seconds pylint: freeze> python -m pip freeze --all pylint: astroid==3.3.3,dill==0.3.8,isort==5.13.2,mccabe==0.7.0,pip==24.2,platformdirs==4.3.6,pylint==3.3.0,setuptools==75.1.0,tomlkit==0.13.2,wheel==0.44.0 pylint: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + trim trailing whitespace.................................................Passed Tabs remover.............................................................Passed autopep8.................................................................Passed perltidy.................................................................Passed pre-commit: commands[3] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run gitlint-ci --hook-stage manual [INFO] Installing environment for https://github.com/jorisroovers/gitlint. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... 
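The pylint environment above runs a single find/pylint invocation over the functional-test sources; to reproduce the same gate locally from the tests/ directory (assuming pylint>=2.6.0 is installed, as in the tox env), the command from the log, reflowed across lines, is:

    find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 \
        --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code \
        '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' \
        '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' \
        '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' +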
------------------------------------ Your code has been rated at 10.00/10 gitlint..................................................................Passed pylint: OK ✔ in 27.78 seconds pre-commit: OK ✔ in 56.77 seconds buildcontroller: OK ✔ in 1 minute 48.8 seconds testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests121: freeze> python -m pip freeze --all build_karaf_tests221: freeze> python -m pip freeze --all sims: freeze> python -m pip freeze --all build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED 
--add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 sims: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./install_lightynode.sh Using lighynode version 20.1.0.2 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory sims: OK ✔ in 11.71 seconds build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests71: freeze> python -m pip freeze --all build_karaf_tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests221: OK ✔ in 1 minute 24.07 seconds build_karaf_tests121: OK ✔ in 1 minute 24.26 seconds tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests_hybrid: freeze> python -m pip freeze --all tests_tapi: freeze> python -m pip freeze --all build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh tapi using environment variables from ./karaf221.env pytest -q transportpce_tests/tapi/test01_abstracted_topology.py build_karaf_tests71: OK ✔ in 1 minute 59.59 seconds testsPCE: freeze> python -m pip freeze --all testsPCE: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.53.1,gnpy4tpce==2.4.7,idna==3.10,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==2.1.5,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.5.0,pbr==5.11.1,pillow==10.4.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.1.4,pytest==8.3.3,python-dateutil==2.9.0.post0,pytz==2024.2,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.3,Werkzeug==2.0.3,wheel==0.44.0,xlrd==1.2.0 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh pce pytest -q transportpce_tests/pce/test01_pce.py .................................................. [100%] 20 passed in 117.89s (0:01:57) pytest -q transportpce_tests/pce/test02_pce_400G.py ................ [100%] 9 passed in 38.22s pytest -q transportpce_tests/pce/test03_gnpy.py ................. [100%] 8 passed in 37.61s pytest -q transportpce_tests/pce/test04_pce_bug_fix.py ... [100%] 3 passed in 35.49s build_karaf_tests_hybrid: OK ✔ in 1 minute 5.02 seconds testsPCE: OK ✔ in 6 minutes 13.66 seconds tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt tests121: freeze> python -m pip freeze --all tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1 using environment variables from ./karaf121.env pytest -q transportpce_tests/1.2.1/test01_portmapping.py .... [100%] 50 passed in 410.91s (0:06:50) pytest -q transportpce_tests/tapi/test02_full_topology.py ....F.FFFF.......F................... 
[100%] =================================== FAILURES =================================== _____________ TransportPCEtesting.test_12_check_openroadm_topology _____________ self = def test_12_check_openroadm_topology(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['node']), 13, 'There should be 13 openroadm nodes') E AssertionError: 14 != 13 : There should be 13 openroadm nodes transportpce_tests/tapi/test02_full_topology.py:272: AssertionError =========================== short test summary info ============================ FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_12_check_openroadm_topology 1 failed, 29 passed in 266.23s (0:04:26) tests_tapi: exit 1 (677.60 seconds) /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh tapi pid=30863 tests_tapi: FAIL ✖ in 11 minutes 37.48 seconds tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt .Ftests71: freeze> python -m pip freeze --all tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 7.1 using environment variables from ./karaf71.env pytest -q transportpce_tests/7.1/test01_portmapping.py FFFFFFFFFFFF [100%] =================================== FAILURES =================================== _________ TransportPCEPortMappingTesting.test_02_rdm_device_connected __________ self = def test_02_rdm_device_connected(self): response = test_utils.check_device_connection("ROADMA01") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:54: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_02_rdm_device_connected _________ TransportPCEPortMappingTesting.test_03_rdm_portmapping_info __________ self = def test_03_rdm_portmapping_info(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:60: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_03_rdm_portmapping_info _____ TransportPCEPortMappingTesting.test_04_rdm_portmapping_DEG1_TTP_TXRX _____ self = def test_04_rdm_portmapping_DEG1_TTP_TXRX(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "DEG1-TTP-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:73: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_rdm_portmapping_DEG1_TTP_TXRX _____ TransportPCEPortMappingTesting.test_05_rdm_portmapping_SRG1_PP7_TXRX _____ self = def test_05_rdm_portmapping_SRG1_PP7_TXRX(self): response = 
test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "SRG1-PP7-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:82: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_05_rdm_portmapping_SRG1_PP7_TXRX _____ TransportPCEPortMappingTesting.test_06_rdm_portmapping_SRG3_PP1_TXRX _____ self = def test_06_rdm_portmapping_SRG3_PP1_TXRX(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "SRG3-PP1-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:91: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_06_rdm_portmapping_SRG3_PP1_TXRX _________ TransportPCEPortMappingTesting.test_09_xpdr_portmapping_info _________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? 
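# ----------------------------------------------------------------------------
# Illustrative sketch (not part of the test suite) of the timeout handling
# shown above: requests accepts either a single float or a (connect, read)
# tuple, which the adapter converts into urllib3 timeouts before calling
# conn.urlopen().  The URL is the same RESTCONF resource the failing test
# queries.
import requests

url = "http://localhost:8182/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info"
try:
    requests.get(url, timeout=(10, 10))      # connect timeout 10 s, read timeout 10 s
except requests.exceptions.ConnectionError as err:
    print(f"controller not reachable: {err}")

# A malformed tuple is rejected by the adapter with ValueError, e.g.:
#   requests.get(url, timeout=(10, 10, 10))  -> ValueError: Invalid timeout ...
# ----------------------------------------------------------------------------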
if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_09_xpdr_portmapping_info(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) transportpce_tests/1.2.1/test01_portmapping.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
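# ----------------------------------------------------------------------------
# Sketch of the Retry bookkeeping shown above (illustrative, not from the log):
# with total=0, the first connection error exhausts the budget and increment()
# raises MaxRetryError, which requests later re-wraps as ConnectionError.
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, read=False)
try:
    retry = retry.increment(
        method="GET",
        url="/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info",
        # A real call passes the failing HTTPConnection; None is enough here.
        error=NewConnectionError(None, "Failed to establish a new connection"),
    )
except MaxRetryError as exc:
    print(exc.reason)
# ----------------------------------------------------------------------------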
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_09_xpdr_portmapping_info _______ TransportPCEPortMappingTesting.test_10_xpdr_portmapping_NETWORK1 _______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
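# ----------------------------------------------------------------------------
# Low-level view of the connect step above (illustrative sketch): resolve the
# host, apply the TCP_NODELAY socket option -- the (6, 1, 1) tuple seen in
# socket_options -- and attempt the TCP handshake.  A closed port yields
# ConnectionRefusedError ([Errno 111]), exactly as captured in this run.
import socket

try:
    sock = socket.create_connection(("localhost", 8182), timeout=10)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    sock.close()
except ConnectionRefusedError as exc:
    print(f"nothing is listening on localhost:8182 -> {exc}")
# ----------------------------------------------------------------------------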
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_10_xpdr_portmapping_NETWORK1(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK1") transportpce_tests/1.2.1/test01_portmapping.py:122: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_10_xpdr_portmapping_NETWORK1 _______ TransportPCEPortMappingTesting.test_11_xpdr_portmapping_NETWORK2 _______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
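# ----------------------------------------------------------------------------
# What the failing tests ultimately observe: urllib3's MaxRetryError is
# re-raised by the requests adapter as requests.exceptions.ConnectionError.
# The helper below is a hypothetical reachability probe, not part of the
# suite; credentials mirror the Basic auth header captured above (admin/admin).
import requests

def restconf_reachable(url: str) -> bool:
    """Return True if a GET on the RESTCONF URL does not fail at the transport level."""
    try:
        requests.get(url, auth=("admin", "admin"), timeout=(10, 10))
        return True
    except requests.exceptions.ConnectionError:
        return False

print(restconf_reachable(
    "http://localhost:8182/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info"))
# ----------------------------------------------------------------------------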
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
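# ----------------------------------------------------------------------------
# Sketch of the retries normalisation shown above (illustrative): integers,
# False and None are coerced into Retry objects via Retry.from_int().  The
# Retry(total=0, read=False) reported in these failures appears to be the
# requests adapter's default retry policy passed down to urlopen().
from urllib3.util.retry import Retry

print(Retry.from_int(0))        # -> Retry(total=0, ...)
print(Retry.from_int(None))     # -> Retry.DEFAULT (3 retries)
print(Retry.from_int(False))    # -> retries disabled; errors raised immediately
# ----------------------------------------------------------------------------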
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
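# ----------------------------------------------------------------------------
# Sketch (illustrative) of the Timeout object reported in these failures:
# Timeout(connect=10, read=10, total=None) can be constructed explicitly and
# used instead of a plain float, which is what _get_timeout() normalises above.
import urllib3
from urllib3.util.timeout import Timeout

timeout = Timeout(connect=10, read=10)
pool = urllib3.HTTPConnectionPool("localhost", 8182, timeout=timeout)
print(timeout.connect_timeout, timeout.read_timeout)
# ----------------------------------------------------------------------------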
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_11_xpdr_portmapping_NETWORK2(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK2") transportpce_tests/1.2.1/test01_portmapping.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
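# ----------------------------------------------------------------------------
# The call chain in the traceback above is test -> test_utils.
# get_portmapping_node_attr() -> get_request() -> requests.request().  The
# stand-in below is hypothetical (the real helpers live in
# transportpce_tests/common/test_utils.py); it only illustrates the
# portmapping URL layout and headers seen in this run.
import requests

RESTCONF_BASE = "http://localhost:8182/rests/data"   # assumed base URL

def get_portmapping_node_attr(node: str, attr: str, value: str | None) -> requests.Response:
    url = f"{RESTCONF_BASE}/transportpce-portmapping:network/nodes={node}/{attr}"
    if value is not None:
        url += f"={value}"
    return requests.request(
        "GET",
        url,
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        auth=("admin", "admin"),
        timeout=(10, 10),
    )

# e.g. get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK1")
# ----------------------------------------------------------------------------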
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_11_xpdr_portmapping_NETWORK2 _______ TransportPCEPortMappingTesting.test_12_xpdr_portmapping_CLIENT1 ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
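The Retry(total=0, connect=None, read=False, ...) value appearing in these frames is the default budget requests installs, so a single refused connect already exhausts it. If a caller wanted to ride out a controller that is still starting, a Retry object can be mounted explicitly; this is a sketch, not what transportpce_tests/common/test_utils.py actually does:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=5,                 # overall cap across error categories
    connect=5,               # retry refused/failed TCP connects
    backoff_factor=0.5,      # 0.5s, 1s, 2s, ... between attempts
    status_forcelist=[502, 503, 504],
)
session.mount("http://", HTTPAdapter(max_retries=retry))

# Same request as in the log, now retried with backoff instead of failing
# immediately (it still raises ConnectionError if the controller never comes up).
resp = session.get(
    "http://localhost:8182/rests/data/transportpce-portmapping:network"
    "/nodes=XPDRA01/mapping=XPDR1-CLIENT1",
    auth=("admin", "admin"), timeout=10,
)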
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
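For reference, the urlopen call the adapter issues above can be reproduced directly against urllib3 with the same flags seen in the traceback (redirect=False, preload_content=False, a zero retry budget); a sketch assuming the same host and path, with authentication omitted:

import urllib3
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

pool = urllib3.HTTPConnectionPool("localhost", port=8182, maxsize=1)
try:
    resp = pool.urlopen(
        "GET",
        "/rests/data/transportpce-portmapping:network"
        "/nodes=XPDRA01/mapping=XPDR1-CLIENT1",
        headers={"Accept": "application/json"},
        redirect=False,
        preload_content=False,    # body is left on the socket, as requests does
        retries=Retry(total=0),   # the same budget requests installs by default
    )
    print(resp.status)
    resp.release_conn()
except MaxRetryError as err:
    print("connect failed:", err.reason)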
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_12_xpdr_portmapping_CLIENT1(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT1") transportpce_tests/1.2.1/test01_portmapping.py:144: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
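The frames above show the failing call chain: test_utils.get_portmapping_node_attr builds the RESTCONF URL and hands it to a thin get_request wrapper around requests.request. The real helpers live in transportpce_tests/common/test_utils.py; the following is only an approximation inferred from the URLs and headers in this log:

import requests

RESTCONF_BASE_URL = "http://localhost:8182/rests/data"   # assumed from the log
CREDENTIALS = ("admin", "admin")            # matches the Basic YWRtaW46YWRtaW4= header

def get_request(url: str) -> requests.Response:
    # Thin wrapper comparable to test_utils.get_request (approximation only).
    return requests.request(
        "GET", url,
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        auth=CREDENTIALS, timeout=10)

def get_portmapping_node_attr(node: str, attr: str, value: str) -> requests.Response:
    # e.g. ("XPDRA01", "mapping", "XPDR1-CLIENT2") targets
    # .../transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2
    url = (f"{RESTCONF_BASE_URL}/transportpce-portmapping:network"
           f"/nodes={node}/{attr}={value}")
    return get_request(url)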
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_12_xpdr_portmapping_CLIENT1 _______ TransportPCEPortMappingTesting.test_13_xpdr_portmapping_CLIENT2 ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
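The chunked decision repeated in the adapter code above is the whole rule: a request body without a known Content-Length is streamed with chunked transfer encoding, and these GETs have no body, so chunked stays False. A small illustration of the body-with-no-length case, using a placeholder URL and no network traffic:

import requests

def payload():
    # A generator has no len(), so requests cannot set Content-Length.
    yield b'{"example": '
    yield b'"chunked body"}'

# Placeholder URL: prepare() only builds the request, nothing is sent.
prepared = requests.Request(
    "POST", "http://localhost:8182/example", data=payload()).prepare()

print(prepared.headers.get("Content-Length"))     # None
print(prepared.headers.get("Transfer-Encoding"))  # chunked
# In HTTPAdapter.send this is exactly the condition that sets chunked=True:
print(prepared.body is None, "Content-Length" in prepared.headers)   # False False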
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
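Both layers accept the timeout in several forms: requests takes a float or a (connect, read) tuple and converts it to urllib3's Timeout (aliased TimeoutSauce in the adapter code), while urlopen takes a float or a Timeout directly. The Timeout(connect=10, read=10, total=None) in these frames comes from the tests passing timeout=10; equivalent spellings, as a sketch:

from urllib3.util.timeout import Timeout

# requests converts either spelling into an urllib3 Timeout:
#   requests.get(url, timeout=10)      -> Timeout(connect=10, read=10)
#   requests.get(url, timeout=(5, 30)) -> Timeout(connect=5,  read=30)
t = Timeout(connect=10, read=10)   # the value shown in these frames; total stays None
print(t.connect_timeout, t.read_timeout, t.total)   # 10 10 None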
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
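In the _new_conn frames earlier, the raw OS errors are translated into distinct urllib3 exceptions (DNS failure to NameResolutionError, connect timeout to ConnectTimeoutError, refused socket to NewConnectionError) before Retry.increment wraps the last one in MaxRetryError. A sketch of dispatching on the wrapped reason, the same check requests' adapter performs, assuming urllib3 v2 as used in this run:

import urllib3
from urllib3.exceptions import (
    ConnectTimeoutError, MaxRetryError, NameResolutionError, NewConnectionError,
)

http = urllib3.PoolManager()
try:
    http.request("GET", "http://localhost:8182/rests/data/"
                        "transportpce-portmapping:network")
except MaxRetryError as err:
    # Same dispatch on err.reason that HTTPAdapter.send performs above.
    if isinstance(err.reason, NameResolutionError):
        print("DNS lookup failed")
    elif isinstance(err.reason, ConnectTimeoutError):
        print("TCP connect timed out")
    elif isinstance(err.reason, NewConnectionError):
        print("connection refused or failed:", err.reason)   # errno 111 in this run
    else:
        print("request failed:", err.reason)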
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_13_xpdr_portmapping_CLIENT2(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT2") transportpce_tests/1.2.1/test01_portmapping.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_13_xpdr_portmapping_CLIENT2 _______ TransportPCEPortMappingTesting.test_14_xpdr_portmapping_CLIENT3 ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
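The host.encode("idna") call in create_connection above is only a validity check: hostnames with empty or over-long labels fail to encode, and urllib3 turns the UnicodeError into LocationParseError before any DNS lookup. The stdlib behaviour it relies on:

# Internationalised hostnames encode to their punycode (ACE) form ...
print("bücher.example".encode("idna"))        # b'xn--bcher-kva.example'

# ... while an empty label (the double dot) fails, which urllib3 reports
# as LocationParseError instead of letting the UnicodeError escape.
try:
    "invalid..example".encode("idna")
except UnicodeError as exc:
    print("rejected:", exc)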
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
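release_conn and preload_content, documented above, only matter when the body is not read eagerly: with preload_content=False the caller must consume or release the response so the socket returns to the pool (requests does the equivalent under stream=True). A short urllib3 sketch of that contract, which will of course still fail with MaxRetryError while the controller is down:

import urllib3
from urllib3.util import make_headers

http = urllib3.PoolManager()
resp = http.request(
    "GET",
    "http://localhost:8182/rests/data/transportpce-portmapping:network",
    headers=make_headers(basic_auth="admin:admin"),
    preload_content=False,             # body stays on the socket for now
)
try:
    for chunk in resp.stream(1024):    # consume incrementally
        print(len(chunk))
finally:
    resp.release_conn()                # hand the connection back to the pool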
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_14_xpdr_portmapping_CLIENT3(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT3") transportpce_tests/1.2.1/test01_portmapping.py:168: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
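Every portmapping test in this part of the run fails with the same refused connection because nothing is listening on localhost:8182. One way a suite shaped like TransportPCEPortMappingTesting could fail fast instead of emitting one identical traceback per test is a reachability check in setUpClass; this is a sketch only, not the project's actual fixture:

import socket
import unittest

class TransportPCEPortMappingTestingSketch(unittest.TestCase):
    """Illustrative only: skip the whole class early if RESTCONF is down."""

    @classmethod
    def setUpClass(cls):
        try:
            # Same TCP endpoint the failing requests in this log target.
            socket.create_connection(("localhost", 8182), timeout=5).close()
        except OSError as exc:
            raise unittest.SkipTest(
                f"RESTCONF endpoint localhost:8182 unreachable: {exc}")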
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_14_xpdr_portmapping_CLIENT3 _______ TransportPCEPortMappingTesting.test_15_xpdr_portmapping_CLIENT4 ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
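The root cause reported here is ConnectionRefusedError [Errno 111] against localhost:8182: nothing is listening on the RESTCONF port, so once the controller is down every remaining request in this suite (test_14 through test_17) fails with the same ConnectionError. A small sketch of a wait-for-port probe is shown below; it is a hypothetical helper, not part of test_utils, assuming the host and port seen in the log.

# Hedged sketch (hypothetical helper): wait until the RESTCONF port accepts TCP
# connections before issuing REST requests, so a stopped controller fails fast.
import socket
import time

def wait_for_port(host="localhost", port=8182, deadline=60.0):
    end = time.monotonic() + deadline
    while time.monotonic() < end:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:          # includes ConnectionRefusedError (Errno 111)
            time.sleep(1)
    return False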
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
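For reference, the socket_options = [(6, 1, 1)] shown in the create_connection frame decodes to TCP_NODELAY (on Linux, IPPROTO_TCP is 6 and TCP_NODELAY is 1), which is urllib3's default connection option; the loop over getaddrinfo results then re-raises the last error when every address candidate is refused. A stdlib-only reproduction of the failing step, using the host and port from the log, is sketched below.

import socket

# The options tuple decodes to urllib3's default TCP_NODELAY setting (Linux values).
print((socket.IPPROTO_TCP, socket.TCP_NODELAY, 1))   # -> (6, 1, 1)

# Stdlib-only reproduction of the failing connect, host/port taken from the log:
try:
    socket.create_connection(("localhost", 8182), timeout=10).close()
except ConnectionRefusedError as exc:
    print("controller not listening:", exc)          # [Errno 111] Connection refused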
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
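The urlopen() call quoted above is urllib3's lowest-level request path; the requests adapter hands it the Retry(total=0, ...) object and preload_content=False, and the refused socket surfaces as MaxRetryError from the pool. A hedged sketch of reproducing the same failure directly against the pool (assuming nothing is listening on localhost:8182) follows.

# Hedged sketch of the same low-level call requests makes above.
from urllib3 import HTTPConnectionPool
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

pool = HTTPConnectionPool("localhost", 8182)
try:
    pool.urlopen("GET",
                 "/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3",
                 retries=Retry(total=0, read=False),
                 preload_content=False)
except MaxRetryError as err:
    print(err.reason)   # NewConnectionError(... Connection refused)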
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_15_xpdr_portmapping_CLIENT4(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT4") transportpce_tests/1.2.1/test01_portmapping.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
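The Timeout(connect=10, read=10, total=None) value in these frames is what HTTPAdapter.send() produces when the caller passes a single timeout=10: a float is expanded to the same connect and read timeout, a (connect, read) tuple keeps them separate, and an existing urllib3 Timeout object is passed through unchanged, per the branch shown above. A short illustration is sketched below; the URL is a placeholder, not a path from the suite.

import requests
from urllib3.util import Timeout

url = "http://localhost:8182/rests/data/transportpce-portmapping:network"  # illustrative only
# requests.get(url, timeout=10)                           # float -> Timeout(connect=10, read=10)
# requests.get(url, timeout=(3.05, 27))                   # (connect, read) tuple
# requests.get(url, timeout=Timeout(connect=5, read=30))  # urllib3 Timeout passed through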
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_15_xpdr_portmapping_CLIENT4 _______ TransportPCEPortMappingTesting.test_16_xpdr_device_disconnection _______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
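The except-ladder in adapters.py shown above is where urllib3's MaxRetryError is translated into requests.exceptions.ConnectionError, which is the exception the tests ultimately report. A hedged sketch of a wrapper that catches it around the helper call named in the traceback follows; the wrapper and its import path are assumptions, only get_portmapping_node_attr("XPDRA01", "mapping", ...) is taken from the log.

# Hedged sketch (hypothetical wrapper, not part of the suite): turn a dead
# controller into one clear assertion instead of a long traceback.
import requests
from transportpce_tests.common import test_utils   # import path assumed from the file layout

def get_mapping_or_fail(node, attr, value):
    try:
        return test_utils.get_portmapping_node_attr(node, attr, value)
    except requests.exceptions.ConnectionError as exc:
        raise AssertionError(f"RESTCONF endpoint unreachable: {exc}") from exc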
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
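test_16 unmounts the simulated device by sending a DELETE to the topology-netconf node, as the method, URL and Basic auth header (YWRtaW46YWRtaW4= is admin:admin) in this traceback show. A standalone equivalent of that request is sketched below for illustration only, reusing the host, port and credentials visible in the log; the expected status code is an assumption.

import requests

resp = requests.delete(
    "http://localhost:8182/rests/data/network-topology:network-topology"
    "/topology=topology-netconf/node=XPDRA01",
    auth=("admin", "admin"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    timeout=10,
)
print(resp.status_code)   # a successful unmount typically returns 204 when the controller is up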
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
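The preload_content/release_conn contract described in the urlopen docstring above matters when streaming: with preload_content=False the caller must consume the body and hand the connection back to the pool. A minimal sketch against an arbitrary reachable URL follows; the URL is only an example, unrelated to this build.

import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://httpbin.org/get", preload_content=False)
body = resp.read()          # consume the body explicitly
resp.release_conn()         # then return the connection to the pool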
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_16_xpdr_device_disconnection(self): > response = test_utils.unmount_device("XPDRA01") transportpce_tests/1.2.1/test01_portmapping.py:191: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:360: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_16_xpdr_device_disconnection _______ TransportPCEPortMappingTesting.test_17_xpdr_device_disconnected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query='content=nonconfig', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
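test_17 verifies the disconnect by reading the same topology-netconf node with ?content=nonconfig, i.e. the operational (state) view of the RESTCONF datastore; with the controller down this GET fails with the same refused connection. A hedged sketch of that check is shown below, based on the URL and credentials in the log; the status-code handling is an assumption, not the suite's actual logic.

import requests

resp = requests.get(
    "http://localhost:8182/rests/data/network-topology:network-topology"
    "/topology=topology-netconf/node=XPDRA01",
    params={"content": "nonconfig"},
    auth=("admin", "admin"),
    headers={"Accept": "application/json"},
    timeout=10,
)
disconnected = resp.status_code in (404, 409)   # assumed interpretation of "device gone"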
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_17_xpdr_device_disconnected(self): > response = test_utils.check_device_connection("XPDRA01") transportpce_tests/1.2.1/test01_portmapping.py:195: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:371: in check_device_connection response = get_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
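The Retry(total=0, connect=None, read=False, redirect=None, status=None) object in the frames above is what requests' default HTTPAdapter builds from max_retries=0, so a refused connection is surfaced after a single attempt. A hedged sketch of mounting an adapter with an actual retry budget (the retry values are illustrative, not what the test suite configures):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry connection errors a few times with exponential backoff before giving up.
session.mount("http://", HTTPAdapter(max_retries=Retry(total=3, connect=3, backoff_factor=0.5)))
response = session.get(
    "http://localhost:8182/rests/data/network-topology:network-topology/"
    "topology=topology-netconf/node=XPDRA01?content=nonconfig",
    auth=("admin", "admin"),
    timeout=(10, 10),
)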
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_17_xpdr_device_disconnected _______ TransportPCEPortMappingTesting.test_18_xpdr_device_not_connected _______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
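The except-chain in HTTPAdapter.send above is what translates urllib3's MaxRetryError (here caused by a NewConnectionError) into the requests.exceptions.ConnectionError each test finally reports. A minimal, hypothetical caller-side handling of that outcome:

import requests

try:
    requests.get(
        "http://localhost:8182/rests/data/network-topology:network-topology/"
        "topology=topology-netconf/node=XPDRA01?content=nonconfig",
        auth=("admin", "admin"),
        timeout=(10, 10),
    )
except requests.exceptions.ConnectionError as exc:
    # exc wraps the MaxRetryError whose .reason is the NewConnectionError seen above.
    print("RESTCONF endpoint unreachable:", exc)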
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
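The frame above shows the exact request behind test_18: a GET on /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info against localhost:8182 with basic auth (the Authorization header decodes to admin:admin) and 10 second connect/read timeouts. A hypothetical standalone reproduction with those values taken from the log:

import requests

response = requests.get(
    "http://localhost:8182/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info",
    auth=("admin", "admin"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    timeout=(10, 10),
)
print(response.status_code, response.text)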
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
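The release_conn and preload_content parameters documented above work together: with preload_content=False the caller must read the body and return the connection to the pool itself. An illustrative sketch (URL reused from this log):

import urllib3

http = urllib3.PoolManager()
resp = http.request(
    "GET",
    "http://localhost:8182/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info",
    preload_content=False,  # do not load the body eagerly
)
try:
    body = resp.read()  # reading the full body also makes the connection reusable
finally:
    resp.release_conn()  # hand the connection back to the pool explicitly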
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_18_xpdr_device_not_connected(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) transportpce_tests/1.2.1/test01_portmapping.py:203: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_18_xpdr_device_not_connected _______ TransportPCEPortMappingTesting.test_19_rdm_device_disconnection ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
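The tuple-unpacking branch above is how requests turns its timeout argument into urllib3's Timeout: a single float applies to both phases, a (connect, read) pair sets them separately, and anything else raises the ValueError shown. Illustrative calls (both still fail the same way while the controller is down):

import requests

url = ("http://localhost:8182/rests/data/network-topology:network-topology/"
       "topology=topology-netconf/node=XPDRA01?content=nonconfig")
# One float: 10 s for both the connect and the read phase.
requests.get(url, auth=("admin", "admin"), timeout=10)
# A (connect, read) tuple: this is what produces Timeout(connect=10, read=10) in these frames.
requests.get(url, auth=("admin", "admin"), timeout=(10, 10))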
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
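test_19 unmounts the ROADM by issuing the DELETE visible in the frame above on the topology-netconf node entry (test_utils.unmount_device builds this request). A hypothetical standalone equivalent, with host, path and credentials taken from the log:

import requests

response = requests.delete(
    "http://localhost:8182/rests/data/network-topology:network-topology/"
    "topology=topology-netconf/node=ROADMA01",
    auth=("admin", "admin"),
    headers={"Content-Type": "application/json"},
    timeout=(10, 10),
)
# With the controller running, a 2xx status here would indicate the unmount was accepted.
print(response.status_code)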
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_19_rdm_device_disconnection(self): > response = test_utils.unmount_device("ROADMA01") transportpce_tests/1.2.1/test01_portmapping.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:360: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_19_rdm_device_disconnection ________ TransportPCEPortMappingTesting.test_20_rdm_device_disconnected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query='content=nonconfig', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
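create_connection above ultimately fails in sock.connect with ECONNREFUSED because nothing is listening on localhost:8182 at this point of the run. A minimal probe, independent of urllib3, that reproduces the same errno:

import errno
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(10)
try:
    sock.connect(("localhost", 8182))
    print("controller is listening")
except OSError as exc:
    # With no RESTCONF endpoint bound to the port, this is [Errno 111] Connection refused.
    print(exc.errno == errno.ECONNREFUSED, exc)
finally:
    sock.close()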
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_20_rdm_device_disconnected(self): > response = test_utils.check_device_connection("ROADMA01") transportpce_tests/1.2.1/test01_portmapping.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:371: in check_device_connection response = get_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_20_rdm_device_disconnected _______ TransportPCEPortMappingTesting.test_21_rdm_device_not_connected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_21_rdm_device_not_connected(self): > response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None) transportpce_tests/1.2.1/test01_portmapping.py:223: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:473: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_21_rdm_device_not_connected --------------------------- Captured stdout teardown --------------------------- all processes killed =========================== short test summary info ============================ FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_02_rdm_device_connected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_03_rdm_portmapping_info FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_04_rdm_portmapping_DEG1_TTP_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_05_rdm_portmapping_SRG1_PP7_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_06_rdm_portmapping_SRG3_PP1_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_09_xpdr_portmapping_info FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_10_xpdr_portmapping_NETWORK1 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_11_xpdr_portmapping_NETWORK2 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_12_xpdr_portmapping_CLIENT1 FAILED 
transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_13_xpdr_portmapping_CLIENT2 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_14_xpdr_portmapping_CLIENT3 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_15_xpdr_portmapping_CLIENT4 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_16_xpdr_device_disconnection FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_17_xpdr_device_disconnected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_18_xpdr_device_not_connected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_19_rdm_device_disconnection FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_20_rdm_device_disconnected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_21_rdm_device_not_connected 18 failed, 3 passed in 424.89s (0:07:04) tests121: exit 1 (425.35 seconds) /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1 pid=35457 ............ [100%] 12 passed in 42.50s pytest -q transportpce_tests/7.1/test02_otn_renderer.py .............................................................. [100%] 62 passed in 154.61s (0:02:34) pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py ................................................ [100%] 48 passed in 133.63s (0:02:13) pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py ...................... [100%] 22 passed in 72.06s (0:01:12) tests121: FAIL ✖ in 7 minutes 11.22 seconds tests71: OK ✔ in 6 minutes 50.02 seconds tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt tests221: freeze> python -m pip freeze --all tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 2.2.1 using environment variables from ./karaf221.env pytest -q transportpce_tests/2.2.1/test01_portmapping.py ................................... [100%] 35 passed in 255.90s (0:04:15) pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py ...... [100%] 6 passed in 44.22s pytest -q transportpce_tests/2.2.1/test03_topology.py ............................................ [100%] 44 passed in 135.71s (0:02:15) pytest -q transportpce_tests/2.2.1/test04_otn_topology.py ............ [100%] 12 passed in 58.92s pytest -q transportpce_tests/2.2.1/test05_flex_grid.py ................ [100%] 16 passed in 113.47s (0:01:53) pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py ............................... [100%] 31 passed in 215.51s (0:03:35) pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py .......................... [100%] 26 passed in 90.67s (0:01:30) pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py ...................... 
[100%] 22 passed in 99.06s (0:01:39) pytest -q transportpce_tests/2.2.1/test09_olm.py ........................................ [100%] 40 passed in 182.60s (0:03:02) pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py ........................................................................ [ 74%] ......................... [100%] 97 passed in 490.00s (0:08:09) pytest -q transportpce_tests/2.2.1/test12_end2end.py ...................................................... [100%] 54 passed in 447.10s (0:07:27) pytest -q transportpce_tests/2.2.1/test14_otn_switch_end2end.py ........................................................................ [ 71%] ............................. [100%] 101 passed in 491.83s (0:08:11) pytest -q transportpce_tests/2.2.1/test15_otn_end2end_with_intermediate_switch.py ........................................................................ [ 67%] ................................... [100%] 107 passed in 599.84s (0:09:59) tests221: OK ✔ in 53 minutes 53.93 seconds tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt tests_hybrid: freeze> python -m pip freeze --all tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh hybrid using environment variables from ./karaf121.env pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py ................................................... [100%] 51 passed in 150.57s (0:02:30) pytest -q transportpce_tests/hybrid/test02_B100G_end2end.py ........................................................................ [ 66%] ..................................... [100%] 109 passed in 607.09s (0:10:07) pytest -q transportpce_tests/hybrid/test03_autonomous_reroute.py ..................................................... 
[100%]
53 passed in 438.19s (0:07:18)
tests_hybrid: OK ✔ in 20 minutes 2.84 seconds
buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt
buildlighty: freeze> python -m pip freeze --all
buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0
buildlighty: commands[0] /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh
NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
[ERROR] COMPILATION ERROR :
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
  symbol: class YangModuleInfo
  location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure:
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP. buildlighty: exit 9 (15.81 seconds) /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh pid=60165 buildlighty: command failed but is marked ignore outcome so handling it as success buildcontroller: OK (108.80=setup[7.06]+cmd[101.73] seconds) testsPCE: OK (373.66=setup[143.45]+cmd[230.21] seconds) sims: OK (11.71=setup[7.03]+cmd[4.68] seconds) build_karaf_tests121: OK (84.26=setup[6.96]+cmd[77.30] seconds) tests121: FAIL code 1 (431.22=setup[5.87]+cmd[425.35] seconds) build_karaf_tests221: OK (84.07=setup[6.99]+cmd[77.08] seconds) tests_tapi: FAIL code 1 (697.48=setup[19.89]+cmd[677.60] seconds) tests221: OK (3233.93=setup[6.15]+cmd[3227.78] seconds) build_karaf_tests71: OK (119.59=setup[13.60]+cmd[105.99] seconds) tests71: OK (410.02=setup[6.30]+cmd[403.72] seconds) build_karaf_tests_hybrid: OK (65.02=setup[20.00]+cmd[45.01] seconds) tests_hybrid: OK (1202.84=setup[6.30]+cmd[1196.53] seconds) buildlighty: OK (21.51=setup[5.70]+cmd[15.81] seconds) docs: OK (31.40=setup[27.77]+cmd[3.63] seconds) docs-linkcheck: OK (32.45=setup[26.74]+cmd[5.71] seconds) checkbashisms: OK (2.73=setup[1.82]+cmd[0.02,0.05,0.84] seconds) pre-commit: OK (56.76=setup[3.02]+cmd[0.00,0.01,40.60,13.14] seconds) pylint: OK (27.78=setup[5.07]+cmd[22.71] seconds) evaluation failed :( (5758.91 seconds) + tox_status=255 + echo '---> Completed tox runs' ---> Completed tox runs + for i in .tox/*/log ++ echo .tox/build_karaf_tests121/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests121 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests121 + for i in .tox/*/log ++ echo .tox/build_karaf_tests221/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests221 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests221 + for i in .tox/*/log ++ echo .tox/build_karaf_tests71/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests71 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests71 + for i in .tox/*/log ++ echo .tox/build_karaf_tests_hybrid/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests_hybrid + cp -r .tox/build_karaf_tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests_hybrid + for i in .tox/*/log ++ echo .tox/buildcontroller/log ++ awk -F/ '{print $2}' + tox_env=buildcontroller + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildcontroller + for i in .tox/*/log ++ echo .tox/buildlighty/log ++ awk -F/ '{print $2}' + tox_env=buildlighty + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildlighty + for i in .tox/*/log ++ echo .tox/checkbashisms/log ++ awk -F/ '{print $2}' + tox_env=checkbashisms + cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/checkbashisms + for i in .tox/*/log ++ echo .tox/docs-linkcheck/log ++ awk -F/ '{print $2}' + tox_env=docs-linkcheck + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/docs-linkcheck + for i in .tox/*/log ++ echo .tox/docs/log ++ awk -F/ '{print $2}' + tox_env=docs + cp -r .tox/docs/log 
/w/workspace/transportpce-tox-verify-scandium/archives/tox/docs + for i in .tox/*/log ++ echo .tox/pre-commit/log ++ awk -F/ '{print $2}' + tox_env=pre-commit + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pre-commit + for i in .tox/*/log ++ echo .tox/pylint/log ++ awk -F/ '{print $2}' + tox_env=pylint + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pylint + for i in .tox/*/log ++ echo .tox/sims/log ++ awk -F/ '{print $2}' + tox_env=sims + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/sims + for i in .tox/*/log ++ echo .tox/tests121/log ++ awk -F/ '{print $2}' + tox_env=tests121 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests121 + for i in .tox/*/log ++ echo .tox/tests221/log ++ awk -F/ '{print $2}' + tox_env=tests221 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests221 + for i in .tox/*/log ++ echo .tox/tests71/log ++ awk -F/ '{print $2}' + tox_env=tests71 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests71 + for i in .tox/*/log ++ echo .tox/testsPCE/log ++ awk -F/ '{print $2}' + tox_env=testsPCE + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/testsPCE + for i in .tox/*/log ++ echo .tox/tests_hybrid/log ++ awk -F/ '{print $2}' + tox_env=tests_hybrid + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_hybrid + for i in .tox/*/log ++ echo .tox/tests_tapi/log ++ awk -F/ '{print $2}' + tox_env=tests_tapi + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_tapi + DOC_DIR=docs/_build/html + [[ -d docs/_build/html ]] + echo '---> Archiving generated docs' ---> Archiving generated docs + mv docs/_build/html /w/workspace/transportpce-tox-verify-scandium/archives/docs + echo '---> tox-run.sh ends' ---> tox-run.sh ends + test 255 -eq 0 + exit 255 ++ '[' 1 = 1 ']' ++ '[' -x /usr/bin/clear_console ']' ++ /usr/bin/clear_console -q Build step 'Execute shell' marked build as failure $ ssh-agent -k unset SSH_AUTH_SOCK; unset SSH_AGENT_PID; echo Agent pid 13087 killed; [ssh-agent] Stopped. [PostBuildScript] - [INFO] Executing post build scripts. 
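The tox log collection in the shell trace above is a plain copy loop over .tox/*/log into the archives directory. As a minimal sketch of the same step in Python (workspace and archive paths are taken from the trace; this script is illustrative only and is not part of the LF Jenkins job scripts):

# Illustrative sketch of the "for i in .tox/*/log" archiving loop shown above.
import shutil
from pathlib import Path

WORKSPACE = Path("/w/workspace/transportpce-tox-verify-scandium")
ARCHIVE_DIR = WORKSPACE / "archives" / "tox"

def archive_tox_logs() -> None:
    """Copy each .tox/<env>/log directory to archives/tox/<env>."""
    for log_dir in sorted((WORKSPACE / ".tox").glob("*/log")):
        tox_env = log_dir.parent.name          # e.g. "tests121", "docs"
        dest = ARCHIVE_DIR / tox_env
        # dirs_exist_ok mirrors the overwrite behaviour of "cp -r" (Python >= 3.8)
        shutil.copytree(log_dir, dest, dirs_exist_ok=True)

if __name__ == "__main__":
    archive_tox_logs()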
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins16583790862032634528.sh ---> sysstat.sh [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins16314913148905788210.sh ---> package-listing.sh ++ facter osfamily ++ tr '[:upper:]' '[:lower:]' + OS_FAMILY=debian + workspace=/w/workspace/transportpce-tox-verify-scandium + START_PACKAGES=/tmp/packages_start.txt + END_PACKAGES=/tmp/packages_end.txt + DIFF_PACKAGES=/tmp/packages_diff.txt + PACKAGES=/tmp/packages_start.txt + '[' /w/workspace/transportpce-tox-verify-scandium ']' + PACKAGES=/tmp/packages_end.txt + case "${OS_FAMILY}" in + grep '^ii' + dpkg -l + '[' -f /tmp/packages_start.txt ']' + '[' -f /tmp/packages_end.txt ']' + diff /tmp/packages_start.txt /tmp/packages_end.txt + '[' /w/workspace/transportpce-tox-verify-scandium ']' + mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/ + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-scandium/archives/ [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins1412589307539116459.sh ---> capture-instance-metadata.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-rCHz from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-rCHz/bin to PATH INFO: Running in OpenStack, capturing instance metadata [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins8647304100684893588.sh provisioning config files... Could not find credentials [logs] for transportpce-tox-verify-scandium #15 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-scandium@tmp/config1747340449546964200tmp Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] Run condition [Regular expression match] enabling perform for step [Provide Configuration files] provisioning config files... copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content SERVER_ID=logs [EnvInject] - Variables injected successfully. [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins1342586118296302829.sh ---> create-netrc.sh WARN: Log server credential not found. [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins3268685090516259242.sh ---> python-tools-install.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-rCHz from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-rCHz/bin to PATH [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins17542587792406922026.sh ---> sudo-logs.sh Archiving 'sudo' log.. [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins12227416703480697890.sh ---> job-cost.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-rCHz from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 lf-activate-venv(): INFO: Adding /tmp/venv-rCHz/bin to PATH INFO: No Stack... 
INFO: Retrieving Pricing Info for: v3-standard-4
INFO: Archiving Costs
[transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins7538457368240930023.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
  3.10.13
* 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-rCHz from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-rCHz/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-scandium/15/
INFO: archiving logs to S3

---> uname -a:
Linux prd-ubuntu2004-docker-4c-16g-25219 5.4.0-190-generic #210-Ubuntu SMP Fri Jul 5 17:03:38 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:            x86_64
CPU op-mode(s):          32-bit, 64-bit
Byte Order:              Little Endian
Address sizes:           40 bits physical, 48 bits virtual
CPU(s):                  4
On-line CPU(s) list:     0-3
Thread(s) per core:      1
Core(s) per socket:      1
Socket(s):               4
NUMA node(s):            1
Vendor ID:               AuthenticAMD
CPU family:              23
Model:                   49
Model name:              AMD EPYC-Rome Processor
Stepping:                0
CPU MHz:                 2799.998
BogoMIPS:                5599.99
Virtualization:          AMD-V
Hypervisor vendor:       KVM
Virtualization type:     full
L1d cache:               128 KiB
L1i cache:               128 KiB
L2 cache:                2 MiB
L3 cache:                64 MiB
NUMA node0 CPU(s):       0-3
Vulnerability Gather data sampling:  Not affected
Vulnerability Itlb multihit:         Not affected
Vulnerability L1tf:                  Not affected
Vulnerability Mds:                   Not affected
Vulnerability Meltdown:              Not affected
Vulnerability Mmio stale data:       Not affected
Vulnerability Retbleed:              Vulnerable
Vulnerability Spec store bypass:     Mitigation; Speculative Store Bypass disabled via prctl and seccomp
Vulnerability Spectre v1:            Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:            Mitigation; Retpolines; IBPB conditional; IBRS_FW; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                 Not affected
Vulnerability Tsx async abort:       Not affected
Flags:                   fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities

---> nproc:
4

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
udev            7.8G     0  7.8G   0% /dev
tmpfs           1.6G  1.1M  1.6G   1% /run
/dev/vda1        78G   17G   62G  21% /
tmpfs           7.9G     0  7.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           7.9G     0  7.9G   0% /sys/fs/cgroup
/dev/loop0       68M   68M     0 100% /snap/lxd/22753
/dev/loop1       62M   62M     0 100% /snap/core20/1405
/dev/vda15      105M  6.1M   99M   6% /boot/efi
tmpfs           1.6G     0  1.6G   0% /run/user/1001
/dev/loop3       39M   39M     0 100% /snap/snapd/21759
/dev/loop4       64M   64M     0 100% /snap/core20/2379
/dev/loop5       92M   92M     0 100% /snap/lxd/29619

---> free -m:
              total        used        free      shared  buff/cache   available
Mem:          15997         710        6805           0        8481       14948
Swap:          1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:0e:b0:1e brd ff:ff:ff:ff:ff:ff
    inet 10.30.171.178/23 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 80471sec preferred_lft 80471sec
    inet6 fe80::f816:3eff:fe0e:b01e/64 scope link
       valid_lft forever preferred_lft forever
3: docker0: mtu 1458 qdisc noqueue state DOWN group default
    link/ether 02:42:30:93:5f:3c brd ff:ff:ff:ff:ff:ff
    inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-25219)  09/21/24  _x86_64_  (4 CPU)

07:55:59  LINUX RESTART  (4 CPU)

07:56:01        tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
(per-minute I/O samples, 07:57:01 - 09:34:01)
Average:      34.01      2.95     31.05      0.00    228.48   4831.55      0.00

07:56:01  kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
(per-minute memory samples, 07:57:01 - 09:34:01)
Average:    4135631  11827518   4117424     25.13    249705   7299354   4984300     28.60   6257689   5322773     44936

07:56:01      IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
(per-minute network samples, 07:57:01 - 09:34:01, for ens3, docker0 and lo)
Average:       ens3     17.64     13.19    224.28      2.00      0.00      0.00      0.00      0.00
Average:    docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
Average:         lo     18.98     18.98      8.72      8.72      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-25219)  09/21/24  _x86_64_  (4 CPU)

07:55:59  LINUX RESTART  (4 CPU)

07:56:01        CPU     %user     %nice   %system   %iowait    %steal     %idle
(per-minute per-CPU samples, 07:57:01 - 09:34:01)
Average:        all     17.02      0.24      0.90      1.07      0.07     80.70
Average:          0     17.18      0.24      0.92      1.18      0.07     80.40
Average:          1     17.13      0.24      0.92      0.94      0.07     80.70
Average:          2     16.89      0.25      0.87      1.09      0.07     80.83
Average:          3     16.89      0.22      0.89      1.06      0.07     80.87