Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/113593
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-ubuntu2004-docker-4c-16g-24687 (ubuntu2004-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-scandium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-myPly27TpxZx/agent.13748
SSH_AGENT_PID=13753
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/transportpce-tox-verify-scandium@tmp/private_key_1782884005649357238.key (/w/workspace/transportpce-tox-verify-scandium@tmp/private_key_1782884005649357238.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
The recommended git tool is: NONE
using credential jenkins-ssh
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository git://devvexx.opendaylight.org/mirror/transportpce
 > git init /w/workspace/transportpce-tox-verify-scandium # timeout=10
Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
using GIT_SSH to set credentials jenkins-ssh
Verifying host key using known hosts file
You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
using GIT_SSH to set credentials jenkins-ssh
Verifying host key using known hosts file
You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/93/113593/4 # timeout=10
 > git rev-parse 25e11989b58d618d97414a24deb1347ba1abc808^{commit} # timeout=10
Checking out Revision 25e11989b58d618d97414a24deb1347ba1abc808 (refs/changes/93/113593/4)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 25e11989b58d618d97414a24deb1347ba1abc808 # timeout=10
Commit message: "Bump netconf to 8.0.2"
 > git rev-parse FETCH_HEAD^{commit} # timeout=10
 > git rev-list --no-walk 0369165863367431bc63630f29759e6299e46ef7 # timeout=10
 > git remote # timeout=10
 > git submodule init # timeout=10
 > git submodule sync # timeout=10
 > git config --get remote.origin.url # timeout=10
 > git submodule init # timeout=10
 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
ERROR: No submodules found.
provisioning config files...
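For reference, the same patch set can be checked out outside the CI sandbox. The refspec refs/changes/93/113593/4 and the commit hash come from the log above; the anonymous HTTPS clone URL is an assumption (the job itself fetches from the git:// mirror):

# Hypothetical local reproduction of the checkout performed above.
git clone https://git.opendaylight.org/gerrit/transportpce   # assumed anonymous clone URL
cd transportpce
git fetch origin refs/changes/93/113593/4
git checkout FETCH_HEAD   # should resolve to 25e11989b58d618d97414a24deb1347ba1abc808 ("Bump netconf to 8.0.2")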
copy managed file [npmrc] to file:/home/jenkins/.npmrc copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf [transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins12906811165339661670.sh ---> python-tools-install.sh Setup pyenv: * system (set by /opt/pyenv/version) * 3.8.13 (set by /opt/pyenv/version) * 3.9.13 (set by /opt/pyenv/version) * 3.10.13 (set by /opt/pyenv/version) * 3.11.7 (set by /opt/pyenv/version) lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-LJis lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-LJis/bin to PATH Generating Requirements File Python 3.11.7 pip 24.2 from /tmp/venv-LJis/lib/python3.11/site-packages/pip (python 3.11) appdirs==1.4.4 argcomplete==3.5.0 aspy.yaml==1.3.0 attrs==24.2.0 autopage==0.5.2 beautifulsoup4==4.12.3 boto3==1.35.24 botocore==1.35.24 bs4==0.0.2 cachetools==5.5.0 certifi==2024.8.30 cffi==1.17.1 cfgv==3.4.0 chardet==5.2.0 charset-normalizer==3.3.2 click==8.1.7 cliff==4.7.0 cmd2==2.4.3 cryptography==3.3.2 debtcollector==3.0.0 decorator==5.1.1 defusedxml==0.7.1 Deprecated==1.2.14 distlib==0.3.8 dnspython==2.6.1 docker==4.2.2 dogpile.cache==1.3.3 durationpy==0.7 email_validator==2.2.0 filelock==3.16.1 future==1.0.0 gitdb==4.0.11 GitPython==3.1.43 google-auth==2.35.0 httplib2==0.22.0 identify==2.6.1 idna==3.10 importlib-resources==1.5.0 iso8601==2.1.0 Jinja2==3.1.4 jmespath==1.0.1 jsonpatch==1.33 jsonpointer==3.0.0 jsonschema==4.23.0 jsonschema-specifications==2023.12.1 keystoneauth1==5.8.0 kubernetes==31.0.0 lftools==0.37.10 lxml==5.3.0 MarkupSafe==2.1.5 msgpack==1.1.0 multi_key_dict==2.0.3 munch==4.0.0 netaddr==1.3.0 netifaces==0.11.0 niet==1.4.2 nodeenv==1.9.1 oauth2client==4.1.3 oauthlib==3.2.2 openstacksdk==4.0.0 os-client-config==2.1.0 os-service-types==1.7.0 osc-lib==3.1.0 oslo.config==9.6.0 oslo.context==5.6.0 oslo.i18n==6.4.0 oslo.log==6.1.2 oslo.serialization==5.5.0 oslo.utils==7.3.0 packaging==24.1 pbr==6.1.0 platformdirs==4.3.6 prettytable==3.11.0 pyasn1==0.6.1 pyasn1_modules==0.4.1 pycparser==2.22 pygerrit2==2.0.15 PyGithub==2.4.0 PyJWT==2.9.0 PyNaCl==1.5.0 pyparsing==2.4.7 pyperclip==1.9.0 pyrsistent==0.20.0 python-cinderclient==9.6.0 python-dateutil==2.9.0.post0 python-heatclient==4.0.0 python-jenkins==1.8.2 python-keystoneclient==5.5.0 python-magnumclient==4.7.0 python-openstackclient==7.1.2 python-swiftclient==4.6.0 PyYAML==6.0.2 referencing==0.35.1 requests==2.32.3 requests-oauthlib==2.0.0 requestsexceptions==1.4.0 rfc3986==2.0.0 rpds-py==0.20.0 rsa==4.9 ruamel.yaml==0.18.6 ruamel.yaml.clib==0.2.8 s3transfer==0.10.2 simplejson==3.19.3 six==1.16.0 smmap==5.0.1 soupsieve==2.6 stevedore==5.3.0 tabulate==0.9.0 toml==0.10.2 tomlkit==0.13.2 tqdm==4.66.5 typing_extensions==4.12.2 tzdata==2024.1 urllib3==1.26.20 virtualenv==20.26.5 wcwidth==0.2.13 websocket-client==1.8.0 wrapt==1.16.0 xdg==6.0.0 xmltodict==0.13.0 yq==3.4.3 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PYTHON=python3 [EnvInject] - Variables injected successfully. 
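The python-tools-install.sh step above boils down to creating a temporary venv, recording its path in a marker file, and installing lftools into it. A minimal sketch, assuming any local Python 3 interpreter (the venv path is illustrative; the job uses mktemp -d /tmp/venv-XXXX):

# Rough equivalent of lf-activate-venv as traced above.
python3 -m venv /tmp/venv-example
echo /tmp/venv-example > /tmp/.os_lf_venv        # marker file reused by later lf-activate-venv calls
/tmp/venv-example/bin/pip install --upgrade pip lftools
/tmp/venv-example/bin/pip freeze                 # produces the requirements listing printed above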
[transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins17097217420485241539.sh ---> tox-install.sh + source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-1U55 + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ sed 's/^[ *]* //' ++ awk '{ print $1 }' ++ grep -E '^[0-9.]*[0-9]$' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] + [[ ! -f /tmp/.toxenv ]] + [[ -n '' ]] + python3 -m venv /tmp/venv-1U55 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-1U55' lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-1U55 + echo /tmp/venv-1U55 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv + /tmp/venv-1U55/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-1U55/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-1U55/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-1U55/bin to PATH + PATH=/tmp/venv-1U55/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + python3 --version Python 3.11.7 + python3 -m pip --version pip 24.2 from /tmp/venv-1U55/lib/python3.11/site-packages/pip (python 3.11) + python3 -m pip freeze cachetools==5.5.0 chardet==5.2.0 colorama==0.4.6 distlib==0.3.8 filelock==3.16.1 packaging==24.1 platformdirs==4.3.6 pluggy==1.5.0 pyproject-api==1.8.0 tox==4.20.0 urllib3==1.26.20 virtualenv==20.26.5 [transportpce-tox-verify-scandium] $ /bin/sh -xe /tmp/jenkins6810618720613523077.sh [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PARALLEL=True [EnvInject] - Variables injected successfully. [transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins16994029898501692305.sh ---> tox-run.sh + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/tox + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-scandium/archives/docs + mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/tox + cd /w/workspace/transportpce-tox-verify-scandium/. 
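tox-install.sh follows the same pattern for the tox toolchain: it pins tox, virtualenv and urllib3~=1.26.15 into a venv whose path is saved in /tmp/.toxenv, so that tox-run.sh can reactivate it below. A minimal local equivalent (venv path illustrative):

# Bootstrap the same tox toolchain outside Jenkins.
python3 -m venv /tmp/venv-tox
echo /tmp/venv-tox > /tmp/.toxenv
/tmp/venv-tox/bin/pip install --upgrade pip virtualenv 'urllib3~=1.26.15' tox
/tmp/venv-tox/bin/tox --version                  # 4.20.0 in this build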
+ source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-FSd0 + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ awk '{ print $1 }' ++ sed 's/^[ *]* //' ++ grep -E '^[0-9.]*[0-9]$' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] ++ cat /tmp/.toxenv + lf_venv=/tmp/venv-1U55 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-1U55 from' file:/tmp/.toxenv lf-activate-venv(): INFO: Reuse venv:/tmp/venv-1U55 from file:/tmp/.toxenv + /tmp/venv-1U55/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-1U55/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-1U55/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-1U55/bin to PATH + PATH=/tmp/venv-1U55/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + [[ -d /opt/pyenv ]] + echo '---> Setting up pyenv' ---> Setting up pyenv + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/tmp/venv-1U55/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/tmp/venv-1U55/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ pwd + PYTHONPATH=/w/workspace/transportpce-tox-verify-scandium + export PYTHONPATH + export TOX_TESTENV_PASSENV=PYTHONPATH + TOX_TESTENV_PASSENV=PYTHONPATH + tox --version 4.20.0 from /tmp/venv-1U55/lib/python3.11/site-packages/tox/__init__.py + PARALLEL=True + TOX_OPTIONS_LIST= + [[ -n '' ]] + case ${PARALLEL,,} in + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' + tox --parallel auto --parallel-live + tee -a /w/workspace/transportpce-tox-verify-scandium/archives/tox/tox.log docs: install_deps> python -I -m pip install -r docs/requirements.txt buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt checkbashisms: freeze> python -m pip freeze --all docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt checkbashisms: pip==24.2,setuptools==75.1.0,wheel==0.44.0 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo 
"checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + script ./reflectwarn.sh does not appear to have a #! interpreter line; you may get strange results checkbashisms: OK ✔ in 2.89 seconds pre-commit: install_deps> python -I -m pip install pre-commit pre-commit: freeze> python -m pip freeze --all pre-commit: cfgv==3.4.0,distlib==0.3.8,filelock==3.16.1,identify==2.6.1,nodeenv==1.9.1,pip==24.2,platformdirs==4.3.6,pre-commit==3.8.0,PyYAML==6.0.2,setuptools==75.1.0,virtualenv==20.26.5,wheel==0.44.0 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./fixCIcentOS8reposMirrors.sh pre-commit: commands[1] /w/workspace/transportpce-tox-verify-scandium/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' /usr/bin/cpan pre-commit: commands[2] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run --all-files --show-diff-on-failure [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. buildcontroller: freeze> python -m pip freeze --all [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_controller.sh + update-java-alternatives -l java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64 [INFO] Initializing environment for https://github.com/perltidy/perltidy. + + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; java -version + JAVA_VER=21 + echo 21 21 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; + javac -version [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... 
+ JAVAC_VER=21 + echo 21 21 ok, java is 21 or newer + [ 21 -ge 21 ] + [ 21 -ge 21 ] + echo ok, java is 21 or newer + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp 2024-09-20 20:34:46 URL:https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz [9083702/9083702] -> "/tmp/apache-maven-3.9.8-bin.tar.gz" [1] + sudo mkdir -p /opt + sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt + sudo ln -s /opt/apache-maven-3.9.8 /opt/maven + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn + mvn --version Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256) Maven home: /opt/maven Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 Default locale: en, platform encoding: UTF-8 OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix" NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/perltidy/perltidy. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... 
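build_controller.sh expects Java 21 and Maven 3.9.8; the steps traced above condense to the snippet below (same URL and paths as in the log; the Ubuntu JDK packages are assumed to be already installed):

# Select JDK 21 and install Maven 3.9.8 the way build_controller.sh does.
sudo update-java-alternatives -s java-1.21.0-openjdk-amd64
wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp
sudo mkdir -p /opt
sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt
sudo ln -s /opt/apache-maven-3.9.8 /opt/maven
sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn
mvn --version                                    # expects Apache Maven 3.9.8 on Java 21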
docs: freeze> python -m pip freeze --all docs: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.2,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/html docs-linkcheck: freeze> python -m pip freeze --all docs-linkcheck: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.2,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-scandium/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-scandium/docs/_build/linkcheck /w/workspace/transportpce-tox-verify-scandium/.tox/docs-linkcheck/lib/python3.11/site-packages/sphinx/builders/linkcheck.py:86: RemovedInSphinx80Warning: The default value for 'linkcheck_report_timeouts_as_broken' will change to False in Sphinx 8, meaning that request timeouts will be reported with a new 'timeout' status, instead of as 'broken'. This is intended to provide more detail as to the failure mode. See https://github.com/sphinx-doc/sphinx/issues/11868 for details. 
warnings.warn(deprecation_msg, RemovedInSphinx80Warning, stacklevel=1) docs: OK ✔ in 33.07 seconds pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' trim trailing whitespace.................................................Passed Tabs remover.............................................................Passed autopep8.................................................................docs-linkcheck: OK ✔ in 35.3 seconds pylint: freeze> python -m pip freeze --all pylint: astroid==3.3.3,dill==0.3.8,isort==5.13.2,mccabe==0.7.0,pip==24.2,platformdirs==4.3.6,pylint==3.3.0,setuptools==75.1.0,tomlkit==0.13.2,wheel==0.44.0 pylint: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + Passed perltidy.................................................................Passed pre-commit: commands[3] /w/workspace/transportpce-tox-verify-scandium/tests> pre-commit run gitlint-ci --hook-stage manual [INFO] Installing environment for https://github.com/jorisroovers/gitlint. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... gitlint..................................................................Passed ------------------------------------ Your code has been rated at 10.00/10 pre-commit: OK ✔ in 44.69 seconds pylint: OK ✔ in 27.53 seconds buildcontroller: OK ✔ in 1 minute 47.52 seconds build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt sims: freeze> python -m pip freeze --all build_karaf_tests221: freeze> python -m pip freeze --all build_karaf_tests121: freeze> python -m pip freeze --all sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 sims: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./install_lightynode.sh Using lighynode version 20.1.0.2 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory build_karaf_tests221: 
bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable sims: OK ✔ in 9.91 seconds build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests71: freeze> python -m pip freeze --all build_karaf_tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED 
--add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests221: OK ✔ in 54.67 seconds build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests121: OK ✔ in 55.75 seconds tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt build_karaf_tests71: OK ✔ in 51.96 seconds build_karaf_tests_hybrid: freeze> python -m pip freeze --all tests_tapi: freeze> python -m pip freeze --all build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh tapi using environment variables from ./karaf221.env pytest -q transportpce_tests/tapi/test01_abstracted_topology.py testsPCE: freeze> python -m pip freeze --all testsPCE: 
bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.53.1,gnpy4tpce==2.4.7,idna==3.10,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==2.1.5,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.5.0,pbr==5.11.1,pillow==10.4.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.1.4,pytest==8.3.3,python-dateutil==2.9.0.post0,pytz==2024.2,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.3,Werkzeug==2.0.3,wheel==0.44.0,xlrd==1.2.0
testsPCE: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh pce
pytest -q transportpce_tests/pce/test01_pce.py ........................................ [100%]
20 passed in 120.29s (0:02:00)
pytest -q transportpce_tests/pce/test02_pce_400G.py ..................... [100%]
9 passed in 42.88s
pytest -q transportpce_tests/pce/test03_gnpy.py .............. [100%]
8 passed in 38.65s
pytest -q transportpce_tests/pce/test04_pce_bug_fix.py ............. [100%]
3 passed in 35.50s
build_karaf_tests_hybrid: OK ✔ in 53.8 seconds
testsPCE: OK ✔ in 5 minutes 11.76 seconds
tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt
tests121: freeze> python -m pip freeze --all
tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0
tests121: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1
using environment variables from ./karaf121.env
pytest -q transportpce_tests/1.2.1/test01_portmapping.py ..................... [100%]
21 passed in 85.08s (0:01:25)
pytest -q transportpce_tests/1.2.1/test02_topo_portmapping.py ...... [100%]
6 passed in 44.07s
pytest -q transportpce_tests/1.2.1/test03_topology.py ............................................. [100%]
44 passed in 136.99s (0:02:16)
pytest -q transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py .................. [100%]
50 passed in 588.61s (0:09:48)
pytest -q transportpce_tests/tapi/test02_full_topology.py ....... [100%]
24 passed in 81.26s (0:01:21)
pytest -q transportpce_tests/1.2.1/test05_olm.py ...........F.FFFFFFFFFFF......
[100%] =================================== FAILURES =================================== _____________ TransportPCEtesting.test_12_check_openroadm_topology _____________ self = def test_12_check_openroadm_topology(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['node']), 13, 'There should be 13 openroadm nodes') E AssertionError: 14 != 13 : There should be 13 openroadm nodes transportpce_tests/tapi/test02_full_topology.py:272: AssertionError ________________ TransportPCEtesting.test_14_check_sip_details _________________ self = def test_14_check_sip_details(self): response = test_utils.transportpce_api_rpc_request( 'tapi-common', 'get-service-interface-point-list', None) > self.assertEqual(len(response['output']['sip']), 72, 'There should be 72 service interface point') E AssertionError: 36 != 72 : There should be 72 service interface point transportpce_tests/tapi/test02_full_topology.py:291: AssertionError ____ TransportPCEtesting.test_15_create_connectivity_service_PhotonicMedia _____ self = def test_15_create_connectivity_service_PhotonicMedia(self): self.cr_serv_input_data["end-point"][0]["service-interface-point"]["service-interface-point-uuid"] = self.sAOTS self.cr_serv_input_data["end-point"][1]["service-interface-point"]["service-interface-point-uuid"] = self.sZOTS response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'create-connectivity-service', self.cr_serv_input_data) time.sleep(self.WAITING) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test02_full_topology.py:300: AssertionError ____________ TransportPCEtesting.test_16_get_service_PhotonicMedia _____________ self = def test_16_get_service_PhotonicMedia(self): response = test_utils.get_ordm_serv_list_attr_request("services", str(self.uuid_services.pm)) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/tapi/test02_full_topology.py:330: AssertionError _________ TransportPCEtesting.test_17_create_connectivity_service_ODU __________ self = def test_17_create_connectivity_service_ODU(self): # pylint: disable=line-too-long self.cr_serv_input_data["layer-protocol-name"] = "ODU" self.cr_serv_input_data["end-point"][0]["layer-protocol-name"] = "ODU" self.cr_serv_input_data["end-point"][0]["service-interface-point"]["service-interface-point-uuid"] = self.sAeODU self.cr_serv_input_data["end-point"][1]["layer-protocol-name"] = "ODU" self.cr_serv_input_data["end-point"][1]["service-interface-point"]["service-interface-point-uuid"] = self.sZeODU # self.cr_serv_input_data["connectivity-constraint"]["service-layer"] = "ODU" self.cr_serv_input_data["connectivity-constraint"]["service-level"] = self.uuid_services.pm response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'create-connectivity-service', self.cr_serv_input_data) time.sleep(self.WAITING) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test02_full_topology.py:351: AssertionError _________________ TransportPCEtesting.test_18_get_service_ODU __________________ self = def test_18_get_service_ODU(self): response = test_utils.get_ordm_serv_list_attr_request("services", str(self.uuid_services.odu)) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 
transportpce_tests/tapi/test02_full_topology.py:379: AssertionError _________ TransportPCEtesting.test_19_create_connectivity_service_DSR __________ self = def test_19_create_connectivity_service_DSR(self): # pylint: disable=line-too-long self.cr_serv_input_data["layer-protocol-name"] = "DSR" self.cr_serv_input_data["end-point"][0]["layer-protocol-name"] = "DSR" self.cr_serv_input_data["end-point"][0]["service-interface-point"]["service-interface-point-uuid"] = self.sADSR self.cr_serv_input_data["end-point"][1]["layer-protocol-name"] = "DSR" self.cr_serv_input_data["end-point"][1]["service-interface-point"]["service-interface-point-uuid"] = self.sZDSR # self.cr_serv_input_data["connectivity-constraint"]["service-layer"] = "DSR" self.cr_serv_input_data["connectivity-constraint"]["requested-capacity"]["total-size"]["value"] = "10" self.cr_serv_input_data["connectivity-constraint"]["service-level"] = self.uuid_services.odu response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'create-connectivity-service', self.cr_serv_input_data) time.sleep(self.WAITING) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test02_full_topology.py:401: AssertionError _________________ TransportPCEtesting.test_20_get_service_DSR __________________ self = def test_20_get_service_DSR(self): response = test_utils.get_ordm_serv_list_attr_request("services", str(self.uuid_services.dsr)) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/tapi/test02_full_topology.py:432: AssertionError __________ TransportPCEtesting.test_21_get_connectivity_service_list ___________ self = def test_21_get_connectivity_service_list(self): response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'get-connectivity-service-list', None) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test02_full_topology.py:442: AssertionError _________ TransportPCEtesting.test_22_delete_connectivity_service_DSR __________ self = def test_22_delete_connectivity_service_DSR(self): self.del_serv_input_data["uuid"] = str(self.uuid_services.dsr) response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'delete-connectivity-service', self.del_serv_input_data) > self.assertIn(response["status_code"], (requests.codes.ok, requests.codes.no_content)) E AssertionError: 500 not found in (200, 204) transportpce_tests/tapi/test02_full_topology.py:471: AssertionError _________ TransportPCEtesting.test_23_delete_connectivity_service_ODU __________ self = def test_23_delete_connectivity_service_ODU(self): self.del_serv_input_data["uuid"] = str(self.uuid_services.odu) response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'delete-connectivity-service', self.del_serv_input_data) > self.assertIn(response["status_code"], (requests.codes.ok, requests.codes.no_content)) E AssertionError: 500 not found in (200, 204) transportpce_tests/tapi/test02_full_topology.py:478: AssertionError ____ TransportPCEtesting.test_24_delete_connectivity_service_PhotonicMedia _____ self = def test_24_delete_connectivity_service_PhotonicMedia(self): self.del_serv_input_data["uuid"] = str(self.uuid_services.pm) response = test_utils.transportpce_api_rpc_request( 'tapi-connectivity', 'delete-connectivity-service', self.del_serv_input_data) > self.assertIn(response["status_code"], (requests.codes.ok, requests.codes.no_content)) E AssertionError: 
500 not found in (200, 204)
transportpce_tests/tapi/test02_full_topology.py:485: AssertionError
=========================== short test summary info ============================
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_12_check_openroadm_topology
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_14_check_sip_details
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_15_create_connectivity_service_PhotonicMedia
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_16_get_service_PhotonicMedia
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_17_create_connectivity_service_ODU
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_18_get_service_ODU
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_19_create_connectivity_service_DSR
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_20_get_service_DSR
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_21_get_connectivity_service_list
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_22_delete_connectivity_service_DSR
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_23_delete_connectivity_service_ODU
FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_24_delete_connectivity_service_PhotonicMedia
12 failed, 18 passed in 222.18s (0:03:42)
tests_tapi: exit 1 (811.40 seconds) /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh tapi pid=30666
tests_tapi: FAIL ✖ in 13 minutes 38.36 seconds
tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt
tests71: freeze> python -m pip freeze --all
tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0
tests71: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 7.1
using environment variables from ./karaf71.env
pytest -q transportpce_tests/7.1/test01_portmapping.py ............. [100%]
12 passed in 45.42s
pytest -q transportpce_tests/7.1/test02_otn_renderer.py .............................................................. [100%]
62 passed in 158.44s (0:02:38)
pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py .FFFFFFFFFFFFFFFFFFFFFFF.FFFF.FF.F.FFF............................................ [100%]
48 passed in 136.03s (0:02:16)
pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py ......................
[100%] 22 passed in 72.41s (0:01:12) tests71: OK ✔ in 7 minutes 0.2 seconds tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt tests221: freeze> python -m pip freeze --all tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests221: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 2.2.1 using environment variables from ./karaf221.env pytest -q transportpce_tests/2.2.1/test01_portmapping.py FFFFF [100%] =================================== FAILURES =================================== ______________ TransportOlmTesting.test_03_rdmA_device_connected _______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'PUT' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' body = '{"node": [{"node-id": "ROADMA01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "...is": "60000", "netconf-node-topology:max-connection-attempts": "0", "netconf-node-topology:keepalive-delay": "120"}}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '629', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. 
if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'PUT' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
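# Note on the values bound above: self here is Retry(total=0, connect=None, read=False, ...),
# so the first refused connection already exhausts the retry budget; increment() below
# therefore ends in MaxRetryError, which requests re-raises as the ConnectionError shown
# for test_03_rdmA_device_connected.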
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_03_rdmA_device_connected(self): > response = test_utils.mount_device("ROADMA01", ('roadma-full', self.NODE_VERSION)) transportpce_tests/1.2.1/test05_olm.py:60: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:341: in mount_device response = put_request(url[RESTCONF_VERSION].format('{}', node), body) transportpce_tests/common/test_utils.py:124: in put_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( 
self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_03_rdmA_device_connected ______________ TransportOlmTesting.test_04_rdmC_device_connected _______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'PUT' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' body = '{"node": [{"node-id": "ROADMC01", "netconf-node-topology:netconf-node": {"netconf-node-topology:host": "127.0.0.1", "...is": "60000", "netconf-node-topology:max-connection-attempts": "0", "netconf-node-topology:keepalive-delay": "120"}}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '629', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
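test_04 fails identically, only against node ROADMC01. A quick way to confirm the underlying symptom outside pytest is a plain socket probe of the RESTCONF port; this is a diagnostic sketch, not part of the test suite.

import socket

# connect_ex() returns 0 when something accepts on the port and the errno otherwise;
# 111 (ECONNREFUSED) reproduces exactly what the tracebacks in this log report.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(10)
    print(sock.connect_ex(("localhost", 8182)))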
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. 
if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'PUT' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_04_rdmC_device_connected(self): > response = test_utils.mount_device("ROADMC01", ('roadmc-full', self.NODE_VERSION)) transportpce_tests/1.2.1/test05_olm.py:64: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:341: in mount_device response = put_request(url[RESTCONF_VERSION].format('{}', node), body) transportpce_tests/common/test_utils.py:124: in put_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( 
self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_rdmC_device_connected _____________ TransportOlmTesting.test_05_connect_xpdrA_to_roadmA ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
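The RPC behind test_05_connect_xpdrA_to_roadmA is shown in full in the request body above; a standalone sketch of the same call, assuming the same http://localhost:8182 endpoint and admin/admin credentials seen in the earlier requests, looks like this.

import requests

# POST of the init-xpdr-rdm-links RPC, with the exact input shown in the request body above.
url = "http://localhost:8182/rests/operations/transportpce-networkutils:init-xpdr-rdm-links"
payload = {"input": {"links-input": {
    "xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1",
    "rdm-node": "ROADMA01", "srg-num": "1",
    "termination-point-num": "SRG1-PP1-TXRX"}}}
response = requests.post(url, json=payload, auth=("admin", "admin"), timeout=10)
# While the controller is down this, too, ends in ConnectionError (refused on port 8182).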
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_05_connect_xpdrA_to_roadmA(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/1.2.1/test05_olm.py:68: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_05_connect_xpdrA_to_roadmA _____________ TransportOlmTesting.test_06_connect_roadmA_to_xpdrA ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' body = '{"input": {"links-input": {"xpdr-node": "XPDRA01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMA01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
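The urlopen docstring above describes how the retries argument is interpreted; the Retry(total=0, connect=None, read=False, redirect=None, status=None) object in these tracebacks is what requests passes for its default max_retries, so a refused connection is raised immediately as MaxRetryError. If retrying while the controller is still starting were desirable, a Retry policy could be mounted on the session adapter; this is a sketch of that option, not what the test suite configures:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=5,                    # overall retry budget
    connect=5,                  # retry refused/unreachable connections
    backoff_factor=0.5,         # sleep 0.5s, 1s, 2s, ... between attempts
    status_forcelist=(502, 503, 504),
)
session.mount("http://", HTTPAdapter(max_retries=retry))
# session.post(...) against http:// URLs now retries transient connection failures.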
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_06_connect_roadmA_to_xpdrA(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/1.2.1/test05_olm.py:75: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_06_connect_roadmA_to_xpdrA _____________ TransportOlmTesting.test_07_connect_xpdrC_to_roadmC ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
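The except-clauses above are where urllib3's MaxRetryError is translated into requests.exceptions.ConnectionError, which is the exception each of these tests ultimately fails with. Code that wants to treat an unreachable controller separately from other failures can catch it explicitly; a small sketch using the URL from this traceback:

import requests

try:
    requests.post(
        "http://localhost:8182/rests/operations/"
        "transportpce-networkutils:init-rdm-xpdr-links",
        json={"input": {}},        # placeholder body, not the real RPC input
        timeout=(10, 10),
    )
except requests.exceptions.ConnectionError as exc:
    # exc wraps the MaxRetryError, whose .reason is the NewConnectionError
    # ([Errno 111] Connection refused) shown in this log.
    print(f"controller unreachable: {exc}")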
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-xpdr-rdm-links', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
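The docstring above covers preload_content and release_conn on urllib3's low-level urlopen; requests calls it with preload_content=False (visible in the conn.urlopen call earlier in this traceback) and streams the body itself. An illustrative urllib3-only example of the same pattern, unrelated to this test run and using a public placeholder URL:

import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.com/", preload_content=False)
for chunk in resp.stream(1024):   # read the body incrementally
    pass
resp.release_conn()               # return the connection to the pool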
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-networkutils:init-xpdr-rdm-links' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
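The line chunked = not (request.body is None or "Content-Length" in request.headers) above is why every request in this log is sent with a Content-Length header (the JSON body is a fully materialized 171-byte string) rather than chunked transfer encoding; chunked is only used when the body length is unknown. A short illustration of the two cases, with a placeholder URL and bodies:

import requests

# str/bytes body: requests computes Content-Length, so the adapter keeps chunked=False.
fixed = requests.Request("POST", "http://localhost:8182/example",
                         data=b'{"input": {}}').prepare()
print(fixed.headers["Content-Length"])            # "13"

# Generator body: length unknown, no Content-Length, so it would be sent chunked.
def body():
    yield b"part1"
    yield b"part2"

streamed = requests.Request("POST", "http://localhost:8182/example",
                            data=body()).prepare()
print("Content-Length" in streamed.headers)       # False
print(streamed.headers.get("Transfer-Encoding"))  # "chunked"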
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_07_connect_xpdrC_to_roadmC(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/1.2.1/test05_olm.py:82: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
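The Retry.increment code shown repeatedly above is where the default Retry(total=0, read=False) runs out of budget on the very first connection error and raises MaxRetryError. The same code path can be exercised directly through urllib3's public Retry class; this is purely for illustration, and the None connection placeholder is not how urllib3 constructs the error internally:

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, read=False)   # equivalent to requests' default max_retries
try:
    retry.increment(
        method="POST",
        url="/rests/operations/transportpce-networkutils:init-xpdr-rdm-links",
        error=NewConnectionError(None, "Failed to establish a new connection"),
    )
except MaxRetryError as exc:
    print(exc.reason)   # the NewConnectionError that exhausted the zero-retry budget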
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-xpdr-rdm-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_07_connect_xpdrC_to_roadmC _____________ TransportOlmTesting.test_08_connect_roadmC_to_xpdrC ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' body = '{"input": {"links-input": {"xpdr-node": "XPDRC01", "xpdr-num": "1", "network-num": "1", "rdm-node": "ROADMC01", "srg-num": "1", "termination-point-num": "SRG1-PP1-TXRX"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '171', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-networkutils:init-rdm-xpdr-links', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-networkutils:init-rdm-xpdr-links' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_08_connect_roadmC_to_xpdrC(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDRC01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADMC01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/1.2.1/test05_olm.py:89: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
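The two project frames above (transportpce_tests/common/test_utils.py:685 and :142) are not expanded in this log; judging only from what the traceback shows, transportpce_api_rpc_request() builds a RESTCONF operations URL and hands it to post_request(), which calls requests.request(). A rough, hypothetical equivalent is sketched below; the base URL, credentials, timeout and response envelope are assumptions, not the suite's actual code:

import requests

RESTCONF_BASE = "http://localhost:8182/rests"  # port taken from the connection errors in this log

def transportpce_api_rpc_request(module, rpc, payload=None):
    """POST a RESTCONF RPC such as transportpce-networkutils:init-rdm-xpdr-links."""
    url = f"{RESTCONF_BASE}/operations/{module}:{rpc}"
    body = {"input": payload} if payload is not None else None
    response = requests.request("POST", url, json=body,
                                auth=("admin", "admin"),
                                headers={"Accept": "application/json"},
                                timeout=(10, 10))
    result = {"status_code": response.status_code}
    if response.text:
        result["output"] = response.json().get("output", {})  # assumed response envelope
    return result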
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-networkutils:init-rdm-xpdr-links (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_08_connect_roadmC_to_xpdrC ________________ TransportOlmTesting.test_09_create_OTS_ROADMA _________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-device-renderer:create-ots-oms' body = '{"input": {"node-id": "ROADMA01", "logical-connection-point": "DEG1-TTP-TXRX"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '79', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-device-renderer:create-ots-oms', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
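The parameters dumped just above (method, url, body, headers) are enough to replay this failing RPC by hand once a controller is actually listening on 8182; a short sketch using only values visible in this log:

import requests

resp = requests.post(
    "http://localhost:8182/rests/operations/transportpce-device-renderer:create-ots-oms",
    json={"input": {"node-id": "ROADMA01", "logical-connection-point": "DEG1-TTP-TXRX"}},
    auth=("admin", "admin"),  # same credentials as the Basic Authorization header above
    headers={"Accept": "application/json", "Content-Type": "application/json"},
    timeout=(10, 10),
)
print(resp.status_code, resp.text)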
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
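The Timeout(connect=10, read=10, total=None) objects reported in these frames are the urllib3 form of the (connect timeout, read timeout) tuple described above; a brief illustration of the mapping, using the values the test utilities evidently pass:

from urllib3.util.timeout import Timeout

t = Timeout(connect=10, read=10)
print(t.connect_timeout, t.read_timeout, t.total)  # 10 10 None

# On the requests side the same thing is written as a tuple, e.g. timeout=(10, 10).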
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-device-renderer:create-ots-oms' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? 
if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-device-renderer:create-ots-oms (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_09_create_OTS_ROADMA(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'create-ots-oms', { 'node-id': 'ROADMA01', 'logical-connection-point': 'DEG1-TTP-TXRX' }) transportpce_tests/1.2.1/test05_olm.py:96: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
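The increment() logic shown a little above is the reason these POSTs give up on the very first connection error: with total=0 the decremented Retry object is immediately exhausted. A self-contained illustration, assuming the urllib3 2.x installed in this tox environment (None stands in for the HTTPConnection object that the real call passes):

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
error = NewConnectionError(None, "Failed to establish a new connection: [Errno 111] Connection refused")
try:
    retries.increment(method="POST",
                      url="/rests/operations/transportpce-device-renderer:create-ots-oms",
                      error=error)
except MaxRetryError as exc:
    print(exc.reason)  # the NewConnectionError above
# adapters.send() then converts that MaxRetryError into requests.exceptions.ConnectionError,
# which is the exception each of these tests finally reports.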
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-device-renderer:create-ots-oms (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_09_create_OTS_ROADMA ________________ TransportOlmTesting.test_10_create_OTS_ROADMC _________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-device-renderer:create-ots-oms' body = '{"input": {"node-id": "ROADMC01", "logical-connection-point": "DEG2-TTP-TXRX"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '79', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-device-renderer:create-ots-oms', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
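One small detail worth decoding from the header dumps in these frames: the Authorization value is plain HTTP Basic auth for the functional tests' default credentials, which is what requests generates from auth=("admin", "admin"):

import base64

print(base64.b64decode("YWRtaW46YWRtaW4=").decode())  # admin:admin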
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-device-renderer:create-ots-oms' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-device-renderer:create-ots-oms (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_10_create_OTS_ROADMC(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'create-ots-oms', { 'node-id': 'ROADMC01', 'logical-connection-point': 'DEG2-TTP-TXRX' }) transportpce_tests/1.2.1/test05_olm.py:105: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
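As the except chain in adapters.send() visible just below shows, a MaxRetryError whose reason is a NewConnectionError is re-raised as requests.exceptions.ConnectionError, which is what every one of these tests ultimately reports. A hedged sketch of how a caller could separate "controller never came up" from a genuine RPC-level failure; the helper is illustrative, not part of the suite:

import requests

def call_rpc(url, payload):
    try:
        return requests.post(url, json=payload, auth=("admin", "admin"), timeout=(10, 10))
    except requests.exceptions.ConnectionError as exc:
        # [Errno 111] at this level means the RESTCONF endpoint on port 8182 was never started,
        # so later assertions on status codes or on 'Success' in the output cannot pass either.
        raise RuntimeError(f"controller unreachable: {exc}") from exc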
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-device-renderer:create-ots-oms (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_10_create_OTS_ROADMC __________________ TransportOlmTesting.test_11_get_PM_ROADMA ___________________ self = def test_11_get_PM_ROADMA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'get-pm', { 'node-id': 'ROADMA01', 'resource-type': 'interface', 'granularity': '15min', 'resource-identifier': { 'resource-name': 'OTS-DEG1-TTP-TXRX' } }) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 204 != 200 transportpce_tests/1.2.1/test05_olm.py:124: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_11_get_PM_ROADMA __________________ TransportOlmTesting.test_12_get_PM_ROADMC ___________________ self = def test_12_get_PM_ROADMC(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'get-pm', { 'node-id': 'ROADMC01', 'resource-type': 'interface', 'granularity': '15min', 'resource-identifier': { 'resource-name': 'OTS-DEG2-TTP-TXRX' } }) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 204 != 200 transportpce_tests/1.2.1/test05_olm.py:147: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_12_get_PM_ROADMC ______ TransportOlmTesting.test_13_calculate_span_loss_base_ROADMA_ROADMC ______ self = def 
test_13_calculate_span_loss_base_ROADMA_ROADMC(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'calculate-spanloss-base', { 'src-type': 'link', 'link-id': 'ROADMA01-DEG1-DEG1-TTP-TXRXtoROADMC01-DEG2-DEG2-TTP-TXRX' }) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/1.2.1/test05_olm.py:166: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_13_calculate_span_loss_base_ROADMA_ROADMC ___________ TransportOlmTesting.test_14_calculate_span_loss_base_all ___________ self = def test_14_calculate_span_loss_base_all(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'calculate-spanloss-base', { 'src-type': 'all' }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Success', response['output']['result']) E AssertionError: 'Success' not found in 'Failed' transportpce_tests/1.2.1/test05_olm.py:182: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_14_calculate_span_loss_base_all ___________ TransportOlmTesting.test_15_get_OTS_DEG1_TTP_TXRX_ROADMA ___________ self = def test_15_get_OTS_DEG1_TTP_TXRX_ROADMA(self): response = test_utils.check_node_attribute2_request( 'ROADMA01', 'interface', 'OTS-DEG1-TTP-TXRX', 'org-openroadm-optical-transport-interfaces:ots') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:197: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_15_get_OTS_DEG1_TTP_TXRX_ROADMA ___________ TransportOlmTesting.test_16_get_OTS_DEG2_TTP_TXRX_ROADMC ___________ self = def test_16_get_OTS_DEG2_TTP_TXRX_ROADMC(self): response = test_utils.check_node_attribute2_request( 'ROADMC01', 'interface', 'OTS-DEG2-TTP-TXRX', 'org-openroadm-optical-transport-interfaces:ots') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:208: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_16_get_OTS_DEG2_TTP_TXRX_ROADMC _____________ TransportOlmTesting.test_17_servicePath_create_AToZ ______________ self = def test_17_servicePath_create_AToZ(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test', 'wave-number': '1', 'modulation-format': 'dp-qpsk', 'operation': 'create', 'nodes': [{'node-id': 'XPDRA01', 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1'}, {'node-id': 'ROADMA01', 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'ROADMC01', 'dest-tp': 'SRG1-PP1-TXRX', 'src-tp': 'DEG2-TTP-TXRX'}, {'node-id': 'XPDRC01', 'dest-tp': 'XPDR1-CLIENT1', 'src-tp': 'XPDR1-NETWORK1'}], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Interfaces created successfully for nodes: ', response['output']['result']) E AssertionError: 'Interfaces created successfully for nodes: ' not found in 'ROADMC01 is not mounted on the controller\nXPDRC01 is not mounted on the controller\nROADMA01 is not mounted on the controller\nXPDRA01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:237: 
AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_17_servicePath_create_AToZ _____________ TransportOlmTesting.test_18_servicePath_create_ZToA ______________ self = def test_18_servicePath_create_ZToA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test', 'wave-number': '1', 'modulation-format': 'dp-qpsk', 'operation': 'create', 'nodes': [{'node-id': 'XPDRC01', 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1'}, {'node-id': 'ROADMC01', 'dest-tp': 'DEG2-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'ROADMA01', 'src-tp': 'DEG1-TTP-TXRX', 'dest-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'XPDRA01', 'src-tp': 'XPDR1-NETWORK1', 'dest-tp': 'XPDR1-CLIENT1'}], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Interfaces created successfully for nodes: ', response['output']['result']) E AssertionError: 'Interfaces created successfully for nodes: ' not found in 'ROADMA01 is not mounted on the controller\nXPDRC01 is not mounted on the controller\nXPDRA01 is not mounted on the controller\nROADMC01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:265: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_18_servicePath_create_ZToA _________ TransportOlmTesting.test_19_service_power_setup_XPDRA_XPDRC __________ self = def test_19_service_power_setup_XPDRA_XPDRC(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'service-power-setup', { 'service-name': 'test', 'wave-number': 1, 'nodes': [ { 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1', 'node-id': 'XPDRA01' }, { 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX', 'node-id': 'ROADMA01' }, { 'dest-tp': 'SRG1-PP1-TXRX', 'src-tp': 'DEG2-TTP-TXRX', 'node-id': 'ROADMC01' }, { 'dest-tp': 'XPDR1-CLIENT1', 'src-tp': 'XPDR1-NETWORK1', 'node-id': 'XPDRC01' } ], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Success', response['output']['result']) E AssertionError: 'Success' not found in 'Failed' transportpce_tests/1.2.1/test05_olm.py:304: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_19_service_power_setup_XPDRA_XPDRC ________ TransportOlmTesting.test_20_get_interface_XPDRA_XPDR1_NETWORK1 ________ self = def test_20_get_interface_XPDRA_XPDR1_NETWORK1(self): response = test_utils.check_node_attribute2_request( 'XPDRA01', 'interface', 'XPDR1-NETWORK1-761:768', 'org-openroadm-optical-channel-interfaces:och') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:309: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_20_get_interface_XPDRA_XPDR1_NETWORK1 ____________ TransportOlmTesting.test_21_get_roadmconnection_ROADMA ____________ self = def test_21_get_roadmconnection_ROADMA(self): response = test_utils.check_node_attribute_request( 'ROADMA01', 'roadm-connections', 'SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768') > 
self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:316: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_21_get_roadmconnection_ROADMA ____________ TransportOlmTesting.test_22_get_roadmconnection_ROADMC ____________ self = def test_22_get_roadmconnection_ROADMC(self): response = test_utils.check_node_attribute_request( 'ROADMC01', 'roadm-connections', 'DEG2-TTP-TXRX-SRG1-PP1-TXRX-761:768') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:323: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_22_get_roadmconnection_ROADMC _________ TransportOlmTesting.test_23_service_power_setup_XPDRC_XPDRA __________ self = def test_23_service_power_setup_XPDRC_XPDRA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'service-power-setup', { 'service-name': 'test', 'wave-number': 1, 'nodes': [ { 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1', 'node-id': 'XPDRC01' }, { 'dest-tp': 'DEG2-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX', 'node-id': 'ROADMC01' }, { 'src-tp': 'DEG1-TTP-TXRX', 'dest-tp': 'SRG1-PP1-TXRX', 'node-id': 'ROADMA01' }, { 'src-tp': 'XPDR1-NETWORK1', 'dest-tp': 'XPDR1-CLIENT1', 'node-id': 'XPDRA01' } ], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Success', response['output']['result']) E AssertionError: 'Success' not found in 'Failed' transportpce_tests/1.2.1/test05_olm.py:362: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_23_service_power_setup_XPDRC_XPDRA ________ TransportOlmTesting.test_24_get_interface_XPDRC_XPDR1_NETWORK1 ________ self = def test_24_get_interface_XPDRC_XPDR1_NETWORK1(self): response = test_utils.check_node_attribute2_request( 'XPDRC01', 'interface', 'XPDR1-NETWORK1-761:768', 'org-openroadm-optical-channel-interfaces:och') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:367: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_24_get_interface_XPDRC_XPDR1_NETWORK1 ____________ TransportOlmTesting.test_25_get_roadmconnection_ROADMC ____________ self = def test_25_get_roadmconnection_ROADMC(self): response = test_utils.check_node_attribute_request( 'ROADMC01', 'roadm-connections', 'SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:374: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_25_get_roadmconnection_ROADMC ________ TransportOlmTesting.test_26_service_power_turndown_XPDRA_XPDRC ________ self = def test_26_service_power_turndown_XPDRA_XPDRC(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'service-power-turndown', { 'service-name': 'test', 'wave-number': 1, 'nodes': [ { 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1', 'node-id': 'XPDRA01' }, { 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX', 'node-id': 'ROADMA01' }, { 'dest-tp': 
'SRG1-PP1-TXRX', 'src-tp': 'DEG2-TTP-TXRX', 'node-id': 'ROADMC01' }, { 'dest-tp': 'XPDR1-CLIENT1', 'src-tp': 'XPDR1-NETWORK1', 'node-id': 'XPDRC01' } ], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/1.2.1/test05_olm.py:413: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_26_service_power_turndown_XPDRA_XPDRC ____________ TransportOlmTesting.test_27_get_roadmconnection_ROADMA ____________ self = def test_27_get_roadmconnection_ROADMA(self): response = test_utils.check_node_attribute_request( 'ROADMA01', 'roadm-connections', 'SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:419: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_27_get_roadmconnection_ROADMA ____________ TransportOlmTesting.test_28_get_roadmconnection_ROADMC ____________ self = def test_28_get_roadmconnection_ROADMC(self): response = test_utils.check_node_attribute_request( 'ROADMC01', 'roadm-connections', 'DEG2-TTP-TXRX-SRG1-PP1-TXRX-761:768') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:426: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_28_get_roadmconnection_ROADMC _____________ TransportOlmTesting.test_29_servicePath_delete_AToZ ______________ self = def test_29_servicePath_delete_AToZ(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test', 'wave-number': '1', 'modulation-format': 'dp-qpsk', 'operation': 'delete', 'nodes': [{'node-id': 'XPDRA01', 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1'}, {'node-id': 'ROADMA01', 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'ROADMC01', 'dest-tp': 'SRG1-PP1-TXRX', 'src-tp': 'DEG2-TTP-TXRX'}, {'node-id': 'XPDRC01', 'dest-tp': 'XPDR1-CLIENT1', 'src-tp': 'XPDR1-NETWORK1'}], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Request processed', response['output']['result']) E AssertionError: 'Request processed' not found in 'ROADMC01 is not mounted on the controller\nXPDRC01 is not mounted on the controller\nXPDRA01 is not mounted on the controller\nROADMA01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:454: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_29_servicePath_delete_AToZ _____________ TransportOlmTesting.test_30_servicePath_delete_ZToA ______________ self = def test_30_servicePath_delete_ZToA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test', 'wave-number': '1', 'modulation-format': 'dp-qpsk', 'operation': 'delete', 'nodes': [{'node-id': 'XPDRC01', 'dest-tp': 'XPDR1-NETWORK1', 'src-tp': 'XPDR1-CLIENT1'}, {'node-id': 'ROADMC01', 'dest-tp': 'DEG2-TTP-TXRX', 'src-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'ROADMA01', 
'src-tp': 'DEG1-TTP-TXRX', 'dest-tp': 'SRG1-PP1-TXRX'}, {'node-id': 'XPDRA01', 'src-tp': 'XPDR1-NETWORK1', 'dest-tp': 'XPDR1-CLIENT1'}], 'center-freq': 196.1, 'nmc-width': 40, 'min-freq': 196.075, 'max-freq': 196.125, 'lower-spectral-slot-number': 761, 'higher-spectral-slot-number': 768 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Request processed', response['output']['result']) E AssertionError: 'Request processed' not found in 'XPDRA01 is not mounted on the controller\nXPDRC01 is not mounted on the controller\nROADMA01 is not mounted on the controller\nROADMC01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:482: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_30_servicePath_delete_ZToA _____________ TransportOlmTesting.test_31_connect_xpdrA_to_roadmA ______________ self = def test_31_connect_xpdrA_to_roadmA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '2', 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 204 != 200 transportpce_tests/1.2.1/test05_olm.py:492: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_31_connect_xpdrA_to_roadmA _____________ TransportOlmTesting.test_32_connect_roadmA_to_xpdrA ______________ self = def test_32_connect_roadmA_to_xpdrA(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '2', 'rdm-node': 'ROADMA01', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 204 != 200 transportpce_tests/1.2.1/test05_olm.py:499: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_32_connect_roadmA_to_xpdrA _____________ TransportOlmTesting.test_33_servicePath_create_AToZ ______________ self = def test_33_servicePath_create_AToZ(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test2', 'wave-number': '2', 'modulation-format': 'dp-qpsk', 'operation': 'create', 'nodes': [{'node-id': 'XPDRA01', 'dest-tp': 'XPDR1-NETWORK2', 'src-tp': 'XPDR1-CLIENT2'}, {'node-id': 'ROADMA01', 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP2-TXRX'}], 'center-freq': 196.05, 'nmc-width': 40, 'min-freq': 196.025, 'max-freq': 196.075, 'lower-spectral-slot-number': 753, 'higher-spectral-slot-number': 760 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Interfaces created successfully for nodes', response['output']['result']) E AssertionError: 'Interfaces created successfully for nodes' not found in 'ROADMA01 is not mounted on the controller\nXPDRA01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:522: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_33_servicePath_create_AToZ ________ TransportOlmTesting.test_34_get_interface_XPDRA_XPDR1_NETWORK2 ________ self = def test_34_get_interface_XPDRA_XPDR1_NETWORK2(self): response = test_utils.check_node_attribute2_request( 'XPDRA01', 'interface', 
'XPDR1-NETWORK2-753:760', 'org-openroadm-optical-channel-interfaces:och') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 503 != 200 transportpce_tests/1.2.1/test05_olm.py:528: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_34_get_interface_XPDRA_XPDR1_NETWORK2 _____________ TransportOlmTesting.test_35_servicePath_delete_AToZ ______________ self = def test_35_servicePath_delete_AToZ(self): response = test_utils.transportpce_api_rpc_request( 'transportpce-device-renderer', 'service-path', { 'service-name': 'test2', 'wave-number': '2', 'modulation-format': 'dp-qpsk', 'operation': 'delete', 'nodes': [{'node-id': 'XPDRA01', 'dest-tp': 'XPDR1-NETWORK2', 'src-tp': 'XPDR1-CLIENT2'}, {'node-id': 'ROADMA01', 'dest-tp': 'DEG1-TTP-TXRX', 'src-tp': 'SRG1-PP2-TXRX'}], 'center-freq': 196.05, 'nmc-width': 40, 'min-freq': 196.025, 'max-freq': 196.075, 'lower-spectral-slot-number': 753, 'higher-spectral-slot-number': 760 }) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Request processed', response['output']['result']) E AssertionError: 'Request processed' not found in 'ROADMA01 is not mounted on the controller\nXPDRA01 is not mounted on the controller' transportpce_tests/1.2.1/test05_olm.py:553: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_35_servicePath_delete_AToZ ____________ TransportOlmTesting.test_36_xpdrA_device_disconnected _____________ self = def test_36_xpdrA_device_disconnected(self): response = test_utils.unmount_device("XPDRA01") > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) E AssertionError: 409 not found in (200, 204) transportpce_tests/1.2.1/test05_olm.py:558: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_36_xpdrA_device_disconnected Searching for patterns in karaf.log... Pattern not found after 180 seconds! Node XPDRA01 still not deleted from tpce topology... ____________ TransportOlmTesting.test_37_xpdrC_device_disconnected _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
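
A minimal sketch of a standalone probe for the failure recorded in these frames: urllib3's create_connection() is refused by localhost:8182 ([Errno 111]), i.e. nothing is accepting TCP connections on the RESTCONF port at this point in the run. Host and port are taken from the captured traceback; everything else is illustrative only.

    import socket

    def restconf_port_open(host="localhost", port=8182, timeout=5.0):
        """Return True if a plain TCP connection to host:port succeeds."""
        try:
            # socket.create_connection raises ConnectionRefusedError ([Errno 111])
            # when nothing is listening, which is what these frames show.
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:
            print(f"cannot reach {host}:{port}: {exc}")
            return False
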
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. 
That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
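
The Retry object captured above (total=0, connect=None) exhausts on the first connection error, so increment() raises MaxRetryError immediately and requests converts it into the ConnectionError reported for this test. Purely as a sketch of standard requests/urllib3 API (not something the test suite configures), allowing a few connect retries would look like:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    # Retry refused connections a few times with a short backoff instead of
    # the single attempt (total=0) shown in the captured Retry object.
    session.mount("http://", HTTPAdapter(max_retries=Retry(total=3, connect=3,
                                                           backoff_factor=0.5)))

Retries of this kind would only mask a brief outage; they would not help if the controller process has stopped for good, as the rest of this run suggests.
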
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_37_xpdrC_device_disconnected(self): > response = test_utils.unmount_device("XPDRC01") transportpce_tests/1.2.1/test05_olm.py:561: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:358: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, 
timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_37_xpdrC_device_disconnected ___________ TransportOlmTesting.test_38_calculate_span_loss_current ____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-olm:calculate-spanloss-current' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-olm:calculate-spanloss-current', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
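
The Timeout(connect=10, read=10, total=None) seen in these frames corresponds to the (connect, read) tuple form described in the docstring above. A sketch of an equivalent direct call against the RESTCONF endpoint recorded in this log (plain HTTP on localhost:8182, HTTP Basic credentials decoded from the captured Authorization header):

    import requests

    URL = ("http://localhost:8182/rests/data/network-topology:network-topology"
           "/topology=topology-netconf")

    # timeout=(connect, read) in seconds, matching the captured Timeout values
    response = requests.get(URL, auth=("admin", "admin"),
                            headers={"Accept": "application/json"},
                            timeout=(10, 10))
    print(response.status_code)
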
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-olm:calculate-spanloss-current' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? 
if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-olm:calculate-spanloss-current (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_38_calculate_span_loss_current(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-olm', 'calculate-spanloss-current', None) transportpce_tests/1.2.1/test05_olm.py:565: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:685: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:148: in post_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
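
Based on the captured request parameters (POST, body=None, Content-Length: 0, JSON content type), the calculate-spanloss-current RPC issued through test_utils.transportpce_api_rpc_request appears to reduce to an empty POST against the operations endpoint. A sketch, with URL, port and credentials taken from this log:

    import requests

    RPC = ("http://localhost:8182/rests/operations/"
           "transportpce-olm:calculate-spanloss-current")

    # No input container is sent for this RPC (body=None in the captured call),
    # so a bare POST with an empty body reproduces the request.
    response = requests.post(RPC, auth=("admin", "admin"),
                             headers={"Content-Type": "application/json",
                                      "Accept": "application/json"},
                             timeout=(10, 10))
    print(response.status_code, response.text)
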
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/operations/transportpce-olm:calculate-spanloss-current (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_38_calculate_span_loss_current _____________ TransportOlmTesting.test_39_rdmA_device_disconnected _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
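# --- Editor's note (added, not part of the captured log) ---------------------
# The failures captured here are all "[Errno 111] Connection refused" against
# localhost:8182, i.e. nothing was listening on the controller's RESTCONF port
# any more.  A hypothetical pre-check in the spirit of the create_connection()
# helper shown above fails the same way:
import socket

def restconf_is_up(host: str = "localhost", port: int = 8182,
                   timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:            # ConnectionRefusedError and timeouts are OSError subclasses
        return False

print(restconf_is_up())        # False while karaf/lighty is down
# -----------------------------------------------------------------------------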
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
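# --- Editor's note (added, not part of the captured log) ---------------------
# The urlopen() call whose locals are dumped above is the DELETE issued while
# disconnecting ROADMA01 (test_39, via test_utils.unmount_device).  A minimal
# requests equivalent, reusing the path, port and headers captured in those
# locals (the helper's exact code is not shown in this log):
import requests

node = "ROADMA01"
response = requests.delete(
    "http://localhost:8182/rests/data/network-topology:network-topology"
    f"/topology=topology-netconf/node={node}",
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    auth=("admin", "admin"),
    timeout=(10, 10),
)
print(response.status_code)    # RESTCONF normally answers 204 No Content on success
# -----------------------------------------------------------------------------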
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
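# --- Editor's note (added, not part of the captured log) ---------------------
# Hedged sketch of the urlopen() parameters documented above, exercised through
# urllib3's public PoolManager instead of going through requests.  The values
# mirror what the adapter passes in this traceback (redirect=False,
# preload_content=False, explicit Timeout); retries=False is the urllib3-native
# way to disable retries, which per the docstring above re-raises the
# underlying error instead of wrapping it in MaxRetryError.
import urllib3

http = urllib3.PoolManager()
try:
    resp = http.request(
        "DELETE",
        "http://localhost:8182/rests/data/network-topology:network-topology"
        "/topology=topology-netconf/node=ROADMA01",
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        retries=False,
        redirect=False,
        preload_content=False,
        timeout=urllib3.Timeout(connect=10, read=10),
    )
    print(resp.status)
except urllib3.exceptions.HTTPError as exc:   # NewConnectionError, ProtocolError, ...
    print("request failed:", exc)
# -----------------------------------------------------------------------------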
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
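# --- Editor's note (added, not part of the captured log) ---------------------
# Side remark on the socket_options value [(6, 1, 1)] captured earlier in this
# traceback: it is the default "nodelay" setting the _new_conn() docstring
# above refers to, i.e. (IPPROTO_TCP, TCP_NODELAY, 1) on the Linux builders
# used for this job:
import socket

print((socket.IPPROTO_TCP, socket.TCP_NODELAY, 1))    # (6, 1, 1) on Linux
# -----------------------------------------------------------------------------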
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
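# --- Editor's note (added, not part of the captured log) ---------------------
# Side note on the _is_method_retryable() check in increment() above: it only
# applies to read errors, and by default urllib3 restricts those retries to
# idempotent methods.  The DELETE in this traceback would qualify, while the
# POST used for the spanloss RPC would not; connection errors like the ones in
# this run are decided purely by the connect/total counters instead.
from urllib3.util.retry import Retry

print(Retry.DEFAULT_ALLOWED_METHODS)                  # DELETE, GET, HEAD, OPTIONS, PUT, TRACE
print("DELETE" in Retry.DEFAULT_ALLOWED_METHODS)      # True
print("POST" in Retry.DEFAULT_ALLOWED_METHODS)        # False
# -----------------------------------------------------------------------------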
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_39_rdmA_device_disconnected(self): > response = test_utils.unmount_device("ROADMA01") transportpce_tests/1.2.1/test05_olm.py:574: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:358: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_39_rdmA_device_disconnected _____________ TransportOlmTesting.test_40_rdmC_device_disconnected _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
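# --- Editor's note (added, not part of the captured log) ---------------------
# The except-chain in HTTPAdapter.send() shown above is what maps urllib3's
# MaxRetryError onto requests' public exceptions, which is why the tests see
# requests.exceptions.ConnectionError rather than a urllib3 type.  Calling code
# can distinguish the cases like this (probe() and its URL are hypothetical,
# not part of the test suite):
import requests

def probe(url: str) -> str:
    try:
        requests.get(url, timeout=(10, 10))
        return "ok"
    except requests.exceptions.ConnectTimeout:
        return "connect timeout"                   # ConnectTimeoutError underneath
    except requests.exceptions.RetryError:
        return "retries exhausted on HTTP status"  # ResponseError underneath
    except requests.exceptions.SSLError:
        return "TLS failure"
    except requests.exceptions.ConnectionError:
        return "connection error"                  # e.g. [Errno 111], as in this run

print(probe("http://localhost:8182/rests/data/network-topology:network-topology"))
# -----------------------------------------------------------------------------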
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_40_rdmC_device_disconnected(self): > response = test_utils.unmount_device("ROADMC01") transportpce_tests/1.2.1/test05_olm.py:578: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:358: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMC01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_40_rdmC_device_disconnected --------------------------- Captured stdout teardown --------------------------- all processes killed =========================== short test summary info ============================ FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_03_rdmA_device_connected FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_04_rdmC_device_connected FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_05_connect_xpdrA_to_roadmA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_06_connect_roadmA_to_xpdrA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_07_connect_xpdrC_to_roadmC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_08_connect_roadmC_to_xpdrC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_09_create_OTS_ROADMA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_10_create_OTS_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_11_get_PM_ROADMA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_12_get_PM_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_13_calculate_span_loss_base_ROADMA_ROADMC FAILED 
transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_14_calculate_span_loss_base_all FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_15_get_OTS_DEG1_TTP_TXRX_ROADMA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_16_get_OTS_DEG2_TTP_TXRX_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_17_servicePath_create_AToZ FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_18_servicePath_create_ZToA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_19_service_power_setup_XPDRA_XPDRC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_20_get_interface_XPDRA_XPDR1_NETWORK1 FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_21_get_roadmconnection_ROADMA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_22_get_roadmconnection_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_23_service_power_setup_XPDRC_XPDRA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_24_get_interface_XPDRC_XPDR1_NETWORK1 FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_25_get_roadmconnection_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_26_service_power_turndown_XPDRA_XPDRC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_27_get_roadmconnection_ROADMA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_28_get_roadmconnection_ROADMC FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_29_servicePath_delete_AToZ FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_30_servicePath_delete_ZToA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_31_connect_xpdrA_to_roadmA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_32_connect_roadmA_to_xpdrA FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_33_servicePath_create_AToZ FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_34_get_interface_XPDRA_XPDR1_NETWORK2 FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_35_servicePath_delete_AToZ FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_36_xpdrA_device_disconnected FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_37_xpdrC_device_disconnected FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_38_calculate_span_loss_current FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_39_rdmA_device_disconnected FAILED transportpce_tests/1.2.1/test05_olm.py::TransportOlmTesting::test_40_rdmC_device_disconnected 38 failed, 2 passed in 649.98s (0:10:49) tests121: exit 1 (998.58 seconds) /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh 1.2.1 pid=35353 ................................... [100%] 35 passed in 76.59s (0:01:16) pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py ...... [100%] 6 passed in 46.15s pytest -q transportpce_tests/2.2.1/test03_topology.py ............................................ [100%] 44 passed in 318.49s (0:05:18) pytest -q transportpce_tests/2.2.1/test04_otn_topology.py ............ [100%] 12 passed in 59.68s pytest -q transportpce_tests/2.2.1/test05_flex_grid.py ................ 
[100%] 16 passed in 114.79s (0:01:54) pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py ............................... [100%] 31 passed in 35.46s pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py .......................... [100%] 26 passed in 89.96s (0:01:29) pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py ...................... [100%] 22 passed in 99.11s (0:01:39) pytest -q transportpce_tests/2.2.1/test09_olm.py ........................................ [100%] 40 passed in 362.46s (0:06:02) pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py ........................................................................ [ 74%] ......................... [100%] 97 passed in 489.60s (0:08:09) pytest -q transportpce_tests/2.2.1/test12_end2end.py ...................................................... [100%] 54 passed in 445.56s (0:07:25) pytest -q transportpce_tests/2.2.1/test14_otn_switch_end2end.py ........................................................................ [ 71%] ............................. [100%] 101 passed in 490.12s (0:08:10) pytest -q transportpce_tests/2.2.1/test15_otn_end2end_with_intermediate_switch.py ........................................................................ [ 67%] ................................... [100%] 107 passed in 600.52s (0:10:00) tests121: FAIL ✖ in 16 minutes 44.93 seconds tests221: OK ✔ in 53 minutes 57.75 seconds tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt tests_hybrid: freeze> python -m pip freeze --all tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-scandium/tests> ./launch_tests.sh hybrid using environment variables from ./karaf121.env pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py ................................................... [100%] 51 passed in 149.96s (0:02:29) pytest -q transportpce_tests/hybrid/test02_B100G_end2end.py ........................................................................ [ 66%] ..................................... [100%] 109 passed in 426.41s (0:07:06) pytest -q transportpce_tests/hybrid/test03_autonomous_reroute.py ..................................................... 
[100%] 53 passed in 258.06s (0:04:18)
tests_hybrid: OK ✔ in 14 minutes 1.17 seconds
buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-scandium/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-scandium/tests/test-requirements.txt
buildlighty: freeze> python -m pip freeze --all
buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0
buildlighty: commands[0] /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh
NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
[ERROR] COMPILATION ERROR :
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
  symbol: class YangModuleInfo
  location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure:
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-scandium/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP.
buildlighty: exit 9 (13.33 seconds) /w/workspace/transportpce-tox-verify-scandium/lighty> ./build.sh pid=62478
buildlighty: command failed but is marked ignore outcome so handling it as success
buildcontroller: OK (107.52=setup[8.61]+cmd[98.91] seconds)
testsPCE: OK (311.76=setup[72.90]+cmd[238.86] seconds)
sims: OK (9.91=setup[7.51]+cmd[2.40] seconds)
build_karaf_tests121: OK (55.75=setup[7.67]+cmd[48.08] seconds)
tests121: FAIL code 1 (1004.93=setup[6.35]+cmd[998.58] seconds)
build_karaf_tests221: OK (54.67=setup[7.62]+cmd[47.05] seconds)
tests_tapi: FAIL code 1 (818.36=setup[6.96]+cmd[811.40] seconds)
tests221: OK (3237.75=setup[6.34]+cmd[3231.41] seconds)
build_karaf_tests71: OK (51.96=setup[11.23]+cmd[40.73] seconds)
tests71: OK (420.20=setup[7.01]+cmd[413.19] seconds)
build_karaf_tests_hybrid: OK (53.80=setup[7.84]+cmd[45.96] seconds)
tests_hybrid: OK (841.17=setup[6.07]+cmd[835.11] seconds)
buildlighty: OK (19.57=setup[6.24]+cmd[13.33] seconds)
docs: OK (33.06=setup[30.55]+cmd[2.51] seconds)
docs-linkcheck: OK (35.30=setup[31.01]+cmd[4.29] seconds)
checkbashisms: OK (2.88=setup[1.96]+cmd[0.01,0.06,0.85] seconds)
pre-commit: OK (44.69=setup[3.60]+cmd[0.01,0.01,33.48,7.60] seconds)
pylint: OK (27.53=setup[6.26]+cmd[21.27] seconds)
evaluation failed :( (5500.42 seconds)
+ tox_status=255
+ echo '---> Completed tox runs'
---> Completed tox runs
+ for i in .tox/*/log
++ echo .tox/build_karaf_tests121/log
++ awk -F/ '{print $2}'
+ tox_env=build_karaf_tests121
+ cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests121
+ for i in .tox/*/log
++ echo .tox/build_karaf_tests221/log
++ awk -F/ '{print $2}'
+ tox_env=build_karaf_tests221
+ cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests221
+ for i in .tox/*/log
++ echo .tox/build_karaf_tests71/log
++ awk -F/ '{print $2}'
+ tox_env=build_karaf_tests71
+ cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests71
+ for i in .tox/*/log
++ echo .tox/build_karaf_tests_hybrid/log
++ awk -F/ '{print $2}'
+ tox_env=build_karaf_tests_hybrid
+ cp -r .tox/build_karaf_tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/build_karaf_tests_hybrid
+ for i in .tox/*/log
++ echo .tox/buildcontroller/log
++ awk -F/ '{print $2}'
+ tox_env=buildcontroller
+ cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildcontroller
+ for i in .tox/*/log
++ echo .tox/buildlighty/log
++ awk -F/ '{print $2}'
+ tox_env=buildlighty
+ cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/buildlighty
+ for i in .tox/*/log
++ echo .tox/checkbashisms/log
++ awk -F/ '{print $2}'
+ tox_env=checkbashisms
+ cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/checkbashisms
+ for i in .tox/*/log
++ echo .tox/docs-linkcheck/log
++ awk -F/ '{print $2}'
+ tox_env=docs-linkcheck
+ cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/docs-linkcheck
+ for i in .tox/*/log
++ echo .tox/docs/log
++ awk -F/ '{print $2}'
+ tox_env=docs
+ cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/docs
+ for i in .tox/*/log
++ echo .tox/pre-commit/log
++ awk -F/ '{print $2}'
+ tox_env=pre-commit
+ cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pre-commit
+ for i in .tox/*/log
++ echo .tox/pylint/log
++ awk -F/ '{print $2}'
+ tox_env=pylint
+ cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/pylint
+ for i in .tox/*/log
++ echo .tox/sims/log
++ awk -F/ '{print $2}'
+ tox_env=sims
+ cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/sims
+ for i in .tox/*/log
++ echo .tox/tests121/log
++ awk -F/ '{print $2}'
+ tox_env=tests121
+ cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests121
+ for i in .tox/*/log
++ awk -F/ '{print $2}'
++ echo .tox/tests221/log
+ tox_env=tests221
+ cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests221
+ for i in .tox/*/log
++ echo .tox/tests71/log
++ awk -F/ '{print $2}'
+ tox_env=tests71
+ cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests71
+ for i in .tox/*/log
++ echo .tox/testsPCE/log
++ awk -F/ '{print $2}'
+ tox_env=testsPCE
+ cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/testsPCE
+ for i in .tox/*/log
++ echo .tox/tests_hybrid/log
++ awk -F/ '{print $2}'
+ tox_env=tests_hybrid
+ cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_hybrid
+ for i in .tox/*/log
++ echo .tox/tests_tapi/log
++ awk -F/ '{print $2}'
+ tox_env=tests_tapi
+ cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-scandium/archives/tox/tests_tapi
+ DOC_DIR=docs/_build/html
+ [[ -d docs/_build/html ]]
+ echo '---> Archiving generated docs'
---> Archiving generated docs
+ mv docs/_build/html /w/workspace/transportpce-tox-verify-scandium/archives/docs
+ echo '---> tox-run.sh ends'
---> tox-run.sh ends
+ test 255 -eq 0
+ exit 255
++ '[' 1 = 1 ']'
++ '[' -x /usr/bin/clear_console ']'
++ /usr/bin/clear_console -q
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 13753 killed;
[ssh-agent] Stopped.
[PostBuildScript] - [INFO] Executing post build scripts.
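The repetitive xtrace block above (one '+ for i in .tox/*/log' / '++ echo' / '++ awk' / '+ cp -r' group per tox environment) is simply the log-archiving loop of tox-run.sh copying every .tox/<env>/log directory into archives/tox/. A condensed reconstruction from the trace follows; it is a hedged sketch, not the script's verbatim source.

# Hedged reconstruction of the traced archiving loop; WORKSPACE stands for
# /w/workspace/transportpce-tox-verify-scandium as seen in the trace.
for i in .tox/*/log; do
  tox_env=$(echo "$i" | awk -F/ '{print $2}')   # .tox/<env>/log -> <env>
  cp -r ".tox/${tox_env}/log" "${WORKSPACE}/archives/tox/${tox_env}"
done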
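The exit 255 verdict comes from the tests121 and tests_tapi failures, since buildlighty is marked ignore-outcome; its compilation failure is still worth chasing, though: TPCEUtils.java imports YangModuleInfo from org.opendaylight.yangtools.binding, and that package no longer provides the class on the classpath pulled in by the dependency versions under test. The following local-triage commands are a hedged sketch using standard maven-dependency-plugin goals, not commands run by this job, to find which jar and package actually ship the class.

# Hedged sketch for local triage, not part of the CI job.
cd lighty
# Which org.opendaylight.yangtools artifacts does the tpce build pull in now?
mvn -q dependency:tree -Dincludes=org.opendaylight.yangtools
# Which jar on the compile classpath contains YangModuleInfo, and under which package?
mvn -q dependency:build-classpath -Dmdep.outputFile=cp.txt
tr ':' '\n' < cp.txt | while read -r jar; do
  unzip -l "$jar" 2>/dev/null | grep -q 'YangModuleInfo\.class' && echo "$jar"
done

The import at TPCEUtils.java line 17 (and the uses flagged at lines 21, 343 and 350) then has to match whatever package the class is found under.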
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins17796909327606308304.sh
---> sysstat.sh
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins9076533529583009392.sh
---> package-listing.sh
++ facter osfamily
++ tr '[:upper:]' '[:lower:]'
+ OS_FAMILY=debian
+ workspace=/w/workspace/transportpce-tox-verify-scandium
+ START_PACKAGES=/tmp/packages_start.txt
+ END_PACKAGES=/tmp/packages_end.txt
+ DIFF_PACKAGES=/tmp/packages_diff.txt
+ PACKAGES=/tmp/packages_start.txt
+ '[' /w/workspace/transportpce-tox-verify-scandium ']'
+ PACKAGES=/tmp/packages_end.txt
+ case "${OS_FAMILY}" in
+ dpkg -l
+ grep '^ii'
+ '[' -f /tmp/packages_start.txt ']'
+ '[' -f /tmp/packages_end.txt ']'
+ diff /tmp/packages_start.txt /tmp/packages_end.txt
+ '[' /w/workspace/transportpce-tox-verify-scandium ']'
+ mkdir -p /w/workspace/transportpce-tox-verify-scandium/archives/
+ cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-scandium/archives/
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins7725934898053167576.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
  3.10.13
* 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-LJis from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-LJis/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins11260053417714539384.sh
provisioning config files...
Could not find credentials [logs] for transportpce-tox-verify-scandium #9
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-scandium@tmp/config5340864712828351738tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins17066106577384889922.sh
---> create-netrc.sh
WARN: Log server credential not found.
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins2245750498897475638.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
  3.10.13
* 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-LJis from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-LJis/bin to PATH
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins6641001079058735345.sh
---> sudo-logs.sh
Archiving 'sudo' log..
[transportpce-tox-verify-scandium] $ /bin/bash /tmp/jenkins7974531766646312104.sh
---> job-cost.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
  3.10.13
* 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-LJis from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-LJis/bin to PATH
INFO: No Stack...
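The package-listing.sh trace a few steps up amounts to a dpkg snapshot plus a diff against the earlier packages_start.txt snapshot. A condensed, hedged reconstruction follows; variable values come from the trace, the redirection targets are inferred from the file names, and the real LF script may differ in detail.

# Hedged reconstruction of the traced package-listing logic, not the verbatim script.
workspace=/w/workspace/transportpce-tox-verify-scandium
START_PACKAGES=/tmp/packages_start.txt
END_PACKAGES=/tmp/packages_end.txt
DIFF_PACKAGES=/tmp/packages_diff.txt
PACKAGES=$END_PACKAGES                     # a workspace is set, so this run records the end-of-build snapshot
dpkg -l | grep '^ii' > "$PACKAGES"         # installed Debian packages only
if [ -f "$START_PACKAGES" ] && [ -f "$END_PACKAGES" ]; then
  diff "$START_PACKAGES" "$END_PACKAGES" > "$DIFF_PACKAGES" || true   # diff exits 1 when the package set changed
fi
mkdir -p "$workspace/archives/"
cp -f "$DIFF_PACKAGES" "$END_PACKAGES" "$START_PACKAGES" "$workspace/archives/"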
INFO: Retrieving Pricing Info for: v3-standard-4
INFO: Archiving Costs
[transportpce-tox-verify-scandium] $ /bin/bash -l /tmp/jenkins13712769613140728391.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
  3.10.13
* 3.11.7 (set by /w/workspace/transportpce-tox-verify-scandium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-LJis from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-LJis/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-scandium/9/
INFO: archiving logs to S3
---> uname -a:
Linux prd-ubuntu2004-docker-4c-16g-24687 5.4.0-190-generic #210-Ubuntu SMP Fri Jul 5 17:03:38 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
---> lscpu:
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Byte Order: Little Endian
Address sizes: 40 bits physical, 48 bits virtual
CPU(s): 4
On-line CPU(s) list: 0-3
Thread(s) per core: 1
Core(s) per socket: 1
Socket(s): 4
NUMA node(s): 1
Vendor ID: AuthenticAMD
CPU family: 23
Model: 49
Model name: AMD EPYC-Rome Processor
Stepping: 0
CPU MHz: 2799.998
BogoMIPS: 5599.99
Virtualization: AMD-V
Hypervisor vendor: KVM
Virtualization type: full
L1d cache: 128 KiB
L1i cache: 128 KiB
L2 cache: 2 MiB
L3 cache: 64 MiB
NUMA node0 CPU(s): 0-3
Vulnerability Gather data sampling: Not affected
Vulnerability Itlb multihit: Not affected
Vulnerability L1tf: Not affected
Vulnerability Mds: Not affected
Vulnerability Meltdown: Not affected
Vulnerability Mmio stale data: Not affected
Vulnerability Retbleed: Vulnerable
Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp
Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2: Mitigation; Retpolines; IBPB conditional; IBRS_FW; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds: Not affected
Vulnerability Tsx async abort: Not affected
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
---> nproc:
4
---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
udev            7.8G     0  7.8G   0% /dev
tmpfs           1.6G  1.1M  1.6G   1% /run
/dev/vda1        78G   17G   62G  21% /
tmpfs           7.9G     0  7.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           7.9G     0  7.9G   0% /sys/fs/cgroup
/dev/loop0       62M   62M     0 100% /snap/core20/1405
/dev/loop1       68M   68M     0 100% /snap/lxd/22753
/dev/vda15      105M  6.1M   99M   6% /boot/efi
tmpfs           1.6G     0  1.6G   0% /run/user/1001
/dev/loop3       39M   39M     0 100% /snap/snapd/21759
/dev/loop4       64M   64M     0 100% /snap/core20/2379
/dev/loop5       92M   92M     0 100% /snap/lxd/29619
---> free -m:
              total        used        free      shared  buff/cache   available
Mem:          15997         671        8486           1        6839       14986
Swap:          1023           0        1023
---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host
       valid_lft forever preferred_lft
forever 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000 link/ether fa:16:3e:bc:f0:74 brd ff:ff:ff:ff:ff:ff inet 10.30.170.208/23 brd 10.30.171.255 scope global dynamic ens3 valid_lft 80718sec preferred_lft 80718sec inet6 fe80::f816:3eff:febc:f074/64 scope link valid_lft forever preferred_lft forever 3: docker0: mtu 1458 qdisc noqueue state DOWN group default link/ether 02:42:98:de:af:8b brd ff:ff:ff:ff:ff:ff inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0 valid_lft forever preferred_lft forever ---> sar -b -r -n DEV: Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-24687) 09/20/24 _x86_64_ (4 CPU) 20:32:18 LINUX RESTART (4 CPU) 20:33:02 tps rtps wtps dtps bread/s bwrtn/s bdscd/s 20:34:01 254.57 79.19 175.38 0.00 2499.64 8379.88 0.00 20:35:01 157.02 27.25 129.78 0.00 1956.61 21005.03 0.00 20:36:01 158.50 15.26 143.24 0.00 836.25 45586.00 0.00 20:37:01 122.45 0.87 121.58 0.00 43.19 52305.55 0.00 20:38:01 225.11 15.81 209.30 0.00 4895.57 160244.32 0.00 20:39:01 161.41 1.52 159.89 0.00 83.99 28385.00 0.00 20:40:01 114.53 2.67 111.86 0.00 179.01 2071.71 0.00 20:41:01 68.16 0.17 67.99 0.00 13.33 1101.37 0.00 20:42:01 127.50 2.53 124.96 0.00 459.12 10117.91 0.00 20:43:01 61.39 0.05 61.34 0.00 0.53 972.37 0.00 20:44:01 92.50 0.17 92.33 0.00 16.93 1567.87 0.00 20:45:01 57.02 0.08 56.94 0.00 4.53 821.06 0.00 20:46:01 2.55 0.00 2.55 0.00 0.00 41.06 0.00 20:47:01 74.19 0.47 73.72 0.00 10.80 1074.49 0.00 20:48:01 53.86 1.25 52.61 0.00 28.40 1079.02 0.00 20:49:01 18.18 0.18 17.99 0.00 11.86 721.76 0.00 20:50:01 2.68 0.00 2.68 0.00 0.00 39.19 0.00 20:51:01 5.55 0.02 5.53 0.00 0.13 217.16 0.00 20:52:01 97.57 2.87 94.70 0.00 73.03 9707.15 0.00 20:53:01 48.09 0.48 47.61 0.00 15.46 921.85 0.00 20:54:01 2.35 0.00 2.35 0.00 0.00 44.13 0.00 20:55:01 58.27 0.12 58.16 0.00 4.93 1154.47 0.00 20:56:01 2.77 0.00 2.77 0.00 0.00 54.66 0.00 20:57:01 24.79 0.02 24.78 0.00 0.53 394.67 0.00 20:58:01 49.23 0.00 49.23 0.00 0.00 1053.16 0.00 20:59:01 77.75 0.02 77.74 0.00 0.80 3075.22 0.00 21:00:01 68.39 0.02 68.37 0.00 0.27 1017.03 0.00 21:01:01 76.44 0.02 76.42 0.00 1.60 1219.99 0.00 21:02:01 2.88 0.00 2.88 0.00 0.00 56.39 0.00 21:03:01 2.40 0.00 2.40 0.00 0.00 36.39 0.00 21:04:01 1.38 0.00 1.38 0.00 0.00 17.86 0.00 21:05:01 1.80 0.00 1.80 0.00 0.00 23.73 0.00 21:06:01 72.44 0.02 72.42 0.00 0.40 1059.42 0.00 21:07:01 68.29 0.05 68.24 0.00 1.47 1002.50 0.00 21:08:01 1.83 0.00 1.83 0.00 0.00 30.79 0.00 21:09:01 98.38 0.03 98.35 0.00 1.47 1436.69 0.00 21:10:01 59.89 0.02 59.87 0.00 0.27 866.92 0.00 21:11:01 61.17 0.02 61.16 0.00 1.20 905.45 0.00 21:12:01 2.02 0.00 2.02 0.00 0.00 43.86 0.00 21:13:01 57.47 0.02 57.46 0.00 0.93 864.52 0.00 21:14:01 2.07 0.00 2.07 0.00 0.00 50.79 0.00 21:15:01 2.18 0.00 2.18 0.00 0.00 41.99 0.00 21:16:01 1.65 0.00 1.65 0.00 0.00 26.80 0.00 21:17:01 2.03 0.05 1.98 0.00 1.07 26.40 0.00 21:18:01 1.73 0.00 1.73 0.00 0.00 24.13 0.00 21:19:01 53.79 0.02 53.77 0.00 2.67 818.40 0.00 21:20:01 2.25 0.00 2.25 0.00 0.00 58.79 0.00 21:21:01 2.67 0.00 2.67 0.00 0.00 44.39 0.00 21:22:01 2.53 0.00 2.53 0.00 0.00 46.13 0.00 21:23:01 2.25 0.00 2.25 0.00 0.00 45.19 0.00 21:24:01 1.93 0.00 1.93 0.00 0.00 41.19 0.00 21:25:01 2.05 0.00 2.05 0.00 0.00 37.46 0.00 21:26:01 1.57 0.00 1.57 0.00 0.00 37.86 0.00 21:27:01 69.81 0.02 69.79 0.00 1.60 1030.59 0.00 21:28:01 2.50 0.00 2.50 0.00 0.00 69.72 0.00 21:29:01 2.88 0.00 2.88 0.00 0.00 62.12 0.00 21:30:01 1.57 0.00 1.57 0.00 0.00 33.06 0.00 21:31:01 2.45 0.00 2.45 0.00 0.00 55.06 0.00 21:32:01 1.95 0.00 1.95 0.00 0.00 47.33 0.00 21:33:02 1.93 
0.00 1.93 0.00 0.00 53.59 0.00 21:34:01 16.73 0.02 16.71 0.00 3.12 299.20 0.00 21:35:01 53.99 0.07 53.92 0.00 7.47 774.40 0.00 21:36:01 1.92 0.00 1.92 0.00 0.00 51.32 0.00 21:37:01 2.35 0.00 2.35 0.00 0.00 50.79 0.00 21:38:01 1.83 0.00 1.83 0.00 0.00 45.33 0.00 21:39:01 3.22 0.00 3.22 0.00 0.00 58.66 0.00 21:40:01 1.80 0.00 1.80 0.00 0.00 41.33 0.00 21:41:01 2.43 0.00 2.43 0.00 0.00 53.06 0.00 21:42:01 15.76 0.02 15.75 0.00 3.07 365.67 0.00 21:43:01 69.87 0.00 69.87 0.00 0.00 993.97 0.00 21:44:01 2.37 0.00 2.37 0.00 0.00 63.85 0.00 21:45:01 2.90 0.00 2.90 0.00 0.00 53.59 0.00 21:46:01 2.23 0.00 2.23 0.00 0.00 50.26 0.00 21:47:01 2.63 0.00 2.63 0.00 0.00 47.33 0.00 21:48:01 2.13 0.00 2.13 0.00 0.00 39.73 0.00 21:49:01 2.60 0.00 2.60 0.00 0.00 51.19 0.00 21:50:01 2.52 0.00 2.52 0.00 0.00 47.45 0.00 21:51:01 2.20 0.00 2.20 0.00 0.00 43.59 0.00 21:52:01 7.87 0.05 7.82 0.00 1.47 448.06 0.00 21:53:01 97.58 0.12 97.47 0.00 1.73 9630.66 0.00 21:54:01 3.82 0.80 3.02 0.00 20.53 143.31 0.00 21:55:01 24.38 0.02 24.36 0.00 3.20 603.23 0.00 21:56:01 46.99 0.00 46.99 0.00 0.00 678.15 0.00 21:57:01 2.42 0.00 2.42 0.00 0.00 40.66 0.00 21:58:01 2.22 0.07 2.15 0.00 0.67 41.73 0.00 21:59:01 3.25 0.00 3.25 0.00 0.00 49.86 0.00 22:00:01 1.88 0.00 1.88 0.00 0.00 37.86 0.00 22:01:01 1.97 0.02 1.95 0.00 0.13 41.05 0.00 22:02:01 22.35 0.02 22.33 0.00 1.60 367.60 0.00 22:03:01 51.36 0.00 51.36 0.00 0.00 749.34 0.00 22:04:01 3.37 0.15 3.22 0.00 4.13 171.44 0.00 22:05:01 3.25 0.00 3.25 0.00 0.00 71.98 0.00 22:06:01 5.92 0.00 5.92 0.00 0.00 277.55 0.00 Average: 35.69 1.63 34.07 0.00 119.98 4084.65 0.00 20:33:02 kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 20:34:01 13411004 15440580 538164 3.29 69076 2157908 1257808 7.22 793492 1899896 146260 20:35:01 11214732 14643404 1312580 8.01 119684 3408412 2165616 12.42 1800884 2966600 990904 20:36:01 9969216 13978556 1976192 12.06 145684 3917384 2565664 14.72 2487912 3476648 210280 20:37:01 6518312 13145768 2807480 17.14 187096 6382768 3632024 20.84 3843452 5456656 1289268 20:38:01 3871072 13201544 2740372 16.73 224564 8937528 3608108 20.70 4811648 7009784 275936 20:39:01 1259256 10694296 5245400 32.02 229688 9033424 6785584 38.93 7507800 6908916 720 20:40:01 203896 8609764 7328808 44.74 228708 8021500 8344956 47.88 9423492 6055596 1656 20:41:01 1051936 9462744 6475908 39.53 231844 8023108 7508608 43.08 8587664 6045912 704 20:42:01 167724 8295564 7642464 46.65 241368 7732744 8963300 51.42 9767864 5748940 716 20:43:01 164672 7308536 8628988 52.68 243392 6762896 9776764 56.09 10677692 4854988 52 20:44:01 1857440 8825032 7113060 43.42 245424 6589120 8149932 46.76 9178652 4670712 492 20:45:01 165600 6138440 9798304 59.81 247128 5618768 11094840 63.65 11740220 3817764 160 20:46:01 163300 5965476 9971244 60.87 247160 5453108 11143528 63.93 11876192 3689624 148 20:47:01 1463108 7267876 8669444 52.92 248980 5453632 9720928 55.77 10582036 3687664 200 20:48:01 6818144 12624444 3316080 20.24 249476 5454572 4747948 27.24 5257052 3677896 492 20:49:01 2772456 8580564 7357544 44.91 250028 5455800 8649792 49.63 9293116 3675688 208 20:50:01 2758156 8566328 7371816 45.00 250068 5455844 8665996 49.72 9307796 3675624 76 20:51:01 6472264 12315968 3623204 22.12 251464 5490016 4976644 28.55 5582104 3698596 36252 20:52:01 5728404 11774384 4165124 25.43 257180 5684352 5571084 31.96 6198024 3822516 1712 20:53:01 4690500 10738032 5201336 31.75 258096 5685056 6472408 37.13 7240664 3813716 320 20:54:01 4671416 10719100 5220260 31.87 258112 5685192 6488404 
37.23 7259748 3813828 164 20:55:01 3886620 9935248 6003872 36.65 258740 5685488 7270580 41.71 8052612 3802444 596 20:56:01 3823176 9872052 6066880 37.04 258752 5685724 7334960 42.08 8115532 3802596 252 20:57:01 5070464 11119872 4819520 29.42 258996 5686012 6348112 36.42 6874160 3801736 544 20:58:01 6149800 12234940 3704128 22.61 261212 5717340 5021444 28.81 5766336 3831588 25788 20:59:01 6295536 12414384 3525864 21.52 263252 5747160 4415784 25.33 5592192 3860640 636 21:00:01 7045616 13165528 2775604 16.94 263976 5747492 3627516 20.81 4850520 3855636 256 21:01:01 5805180 11926176 4014212 24.50 264584 5747936 5103148 29.28 6094684 3847376 188 21:02:01 5487404 11608632 4331632 26.44 264596 5748156 5183812 29.74 6411600 3847596 44 21:03:01 5486072 11607320 4332932 26.45 264596 5748176 5183812 29.74 6413224 3847612 36 21:04:01 5485964 11607220 4333008 26.45 264604 5748180 5183812 29.74 6412864 3847608 64 21:05:01 5485168 11606432 4333768 26.46 264604 5748180 5183812 29.74 6412960 3847608 52 21:06:01 7696836 13819004 2122280 12.96 265268 5748408 2946304 16.90 4210440 3846560 112 21:07:01 7725800 13849524 2091664 12.77 266432 5748760 2888212 16.57 4181472 3846408 292 21:08:01 7709576 13833340 2107780 12.87 266448 5748784 2904240 16.66 4198660 3846412 72 21:09:01 8833160 14958192 983676 6.00 267228 5749236 1816832 10.42 3079296 3846556 544 21:10:01 7619004 13744920 2196308 13.41 267668 5749652 2973732 17.06 4287396 3846724 216 21:11:01 7015440 13142148 2798764 17.09 268056 5750048 3668424 21.05 4889164 3846612 240 21:12:01 6807472 12934416 3006228 18.35 268060 5750276 3766544 21.61 5096180 3846804 104 21:13:01 5683756 11811496 4128568 25.20 268364 5750760 5082456 29.16 6214384 3847160 656 21:14:01 5481548 11609536 4330228 26.43 268364 5751008 5164816 29.63 6416016 3847404 396 21:15:01 5456640 11585028 4354720 26.58 268376 5751408 5180804 29.72 6439956 3847792 396 21:16:01 5456720 11585136 4354656 26.58 268388 5751412 5180804 29.72 6439700 3847808 120 21:17:01 5456028 11584460 4355440 26.59 268404 5751408 5180804 29.72 6439852 3847808 260 21:18:01 5454680 11583212 4356580 26.59 268412 5751468 5180804 29.72 6441564 3847856 168 21:19:01 5590188 11719672 4220332 25.76 268920 5751900 5098696 29.25 6307516 3848188 416 21:20:01 5428588 11558464 4381212 26.75 268928 5752280 5196528 29.81 6467500 3848332 188 21:21:01 5420556 11550544 4389132 26.79 268936 5752380 5212524 29.91 6474840 3848432 324 21:22:01 5400680 11530832 4408816 26.91 268948 5752536 5228552 30.00 6494636 3848580 156 21:23:01 5384552 11514904 4424720 27.01 268956 5752728 5228552 30.00 6510488 3848776 168 21:24:01 5363324 11494160 4445348 27.14 268964 5753208 5244568 30.09 6530288 3849236 264 21:25:01 5314596 11445544 4494036 27.43 268968 5753308 5278264 30.28 6579428 3849340 232 21:26:01 5307068 11438284 4501376 27.48 268976 5753564 5294716 30.38 6586288 3849596 436 21:27:01 5704120 11835944 4104076 25.05 269380 5753756 5069592 29.09 6189440 3849608 384 21:28:01 5479948 11612388 4327220 26.42 269380 5754372 5199476 29.83 6413704 3850224 320 21:29:01 5446724 11579656 4359944 26.62 269392 5754852 5215480 29.92 6446292 3850704 136 21:30:01 5432864 11566180 4373416 26.70 269400 5755228 5215480 29.92 6459276 3851076 480 21:31:01 5396740 11530520 4409052 26.92 269400 5755688 5215480 29.92 6494984 3851536 548 21:32:01 5383708 11517924 4421596 26.99 269400 5756144 5231476 30.01 6508116 3851972 212 21:33:02 5337772 11472780 4466724 27.27 269400 5756920 5280836 30.30 6552620 3852764 580 21:34:01 7996380 14131880 1809336 11.05 269404 5757400 2916944 
16.74 3905992 3853188 460 21:35:01 4056340 10191952 5747116 35.08 269596 5757300 6621724 37.99 7830776 3852760 572 21:36:01 3870252 10006252 5932604 36.22 269600 5757684 6736396 38.65 8015200 3853136 448 21:37:01 3785956 9922424 6016464 36.73 269600 5758148 6801872 39.02 8098184 3853604 624 21:38:01 3769624 9906224 6032632 36.83 269604 5758276 6801872 39.02 8113892 3853732 348 21:39:01 3724664 9861616 6077120 37.10 269612 5758620 6817920 39.12 8158364 3854060 176 21:40:01 3717772 9854932 6083720 37.14 269612 5758808 6833912 39.21 8165328 3854264 144 21:41:01 3701384 9838956 6099744 37.24 269616 5759212 6833912 39.21 8181320 3854664 112 21:42:01 8799924 14937784 1003760 6.13 269628 5759456 1826448 10.48 3104364 3854888 388 21:43:01 4172080 10310176 5628868 34.36 269864 5759456 6570512 37.70 7713628 3854728 424 21:44:01 3951876 10090288 5848548 35.70 269864 5759772 6635544 38.07 7932424 3855044 260 21:45:01 3701628 9840704 6097884 37.22 269868 5760428 6798424 39.00 8180536 3855700 504 21:46:01 3690140 9829348 6109224 37.29 269868 5760560 6798424 39.00 8191932 3855832 212 21:47:01 3681036 9820356 6118176 37.35 269872 5760664 6798424 39.00 8200316 3855936 160 21:48:01 3666232 9805736 6132796 37.44 269872 5760848 6798424 39.00 8214764 3856120 112 21:49:01 3658544 9798260 6140272 37.48 269872 5761060 6798424 39.00 8220884 3856332 332 21:50:01 3634532 9774420 6163972 37.63 269872 5761240 6814412 39.10 8245788 3856504 108 21:51:01 3600172 9740312 6198132 37.84 269880 5761476 6863484 39.38 8279232 3856752 332 21:52:01 9182596 15389128 552424 3.37 272020 5821424 1382064 7.93 2657320 3915796 60788 21:53:01 5110388 11490844 4448324 27.15 276260 5988464 5256960 30.16 6599884 4026000 3412 21:54:01 4995396 11376572 4562584 27.85 276260 5989228 5366584 30.79 6714680 4026008 432 21:55:01 6279032 12660540 3279324 20.02 276316 5989484 4823540 27.67 5455292 4007840 308 21:56:01 5177800 11559888 4379336 26.73 276460 5989920 5165104 29.63 6554468 4004620 28 21:57:01 5166376 11548656 4390516 26.80 276468 5990108 5181104 29.73 6565856 4004748 256 21:58:01 5156460 11538924 4400352 26.86 276476 5990280 5181104 29.73 6574984 4004928 292 21:59:01 5148948 11531728 4407508 26.91 276476 5990608 5181104 29.73 6582824 4005208 300 22:00:01 5118580 11501636 4437488 27.09 276476 5990872 5213536 29.91 6612552 4005484 108 22:01:01 5089064 11472448 4466636 27.27 276480 5991196 5213536 29.91 6640428 4005788 68 22:02:01 6965616 13349080 2591048 15.82 276504 5991252 4094844 23.49 4784432 3994596 540 22:03:01 4397448 10781204 5157504 31.48 276636 5991396 6021532 34.55 7341960 3994584 136 22:04:01 4279352 10663796 5274800 32.20 276636 5992080 6107360 35.04 7458568 3995040 320 22:05:01 4211596 10596772 5341784 32.61 276640 5992808 6123392 35.13 7525168 3995760 140 22:06:01 8895944 15340636 600448 3.67 278420 6047224 1441096 8.27 2811436 4041724 56244 Average: 5113493 11325440 4614941 28.17 259017 5848293 5538835 31.78 6636521 3998233 33577 20:33:02 IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 20:34:01 ens3 185.38 143.31 1173.87 37.40 0.00 0.00 0.00 0.00 20:34:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:34:01 lo 0.75 0.75 0.07 0.07 0.00 0.00 0.00 0.00 20:35:01 ens3 290.38 236.51 4425.65 27.98 0.00 0.00 0.00 0.00 20:35:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:35:01 lo 3.93 3.93 0.38 0.38 0.00 0.00 0.00 0.00 20:36:01 ens3 371.24 318.26 5909.35 32.87 0.00 0.00 0.00 0.00 20:36:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:36:01 lo 3.00 3.00 0.32 0.32 0.00 0.00 0.00 0.00 20:37:01 ens3 
310.08 198.12 5059.44 19.25 0.00 0.00 0.00 0.00 20:37:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:37:01 lo 1.20 1.20 0.11 0.11 0.00 0.00 0.00 0.00 20:38:01 ens3 230.67 159.18 2753.84 11.97 0.00 0.00 0.00 0.00 20:38:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:38:01 lo 2.25 2.25 0.21 0.21 0.00 0.00 0.00 0.00 20:39:01 ens3 1.45 1.20 0.18 0.16 0.00 0.00 0.00 0.00 20:39:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:39:01 lo 7.35 7.35 13.87 13.87 0.00 0.00 0.00 0.00 20:40:01 ens3 1.85 1.53 0.30 0.25 0.00 0.00 0.00 0.00 20:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:40:01 lo 48.62 48.62 37.15 37.15 0.00 0.00 0.00 0.00 20:41:01 ens3 1.82 1.70 0.53 0.46 0.00 0.00 0.00 0.00 20:41:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:41:01 lo 35.50 35.50 18.32 18.32 0.00 0.00 0.00 0.00 20:42:01 ens3 1.93 2.03 0.82 0.72 0.00 0.00 0.00 0.00 20:42:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:42:01 lo 8.33 8.33 3.24 3.24 0.00 0.00 0.00 0.00 20:43:01 ens3 1.12 0.90 0.29 0.21 0.00 0.00 0.00 0.00 20:43:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:43:01 lo 12.18 12.18 10.44 10.44 0.00 0.00 0.00 0.00 20:44:01 ens3 0.98 0.75 0.14 0.13 0.00 0.00 0.00 0.00 20:44:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:44:01 lo 17.76 17.76 5.76 5.76 0.00 0.00 0.00 0.00 20:45:01 ens3 0.95 0.82 0.15 0.14 0.00 0.00 0.00 0.00 20:45:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:45:01 lo 14.75 14.75 7.75 7.75 0.00 0.00 0.00 0.00 20:46:01 ens3 1.23 0.98 0.23 0.21 0.00 0.00 0.00 0.00 20:46:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:46:01 lo 17.41 17.41 9.15 9.15 0.00 0.00 0.00 0.00 20:47:01 ens3 1.68 1.08 0.45 0.34 0.00 0.00 0.00 0.00 20:47:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:47:01 lo 14.63 14.63 6.32 6.32 0.00 0.00 0.00 0.00 20:48:01 ens3 1.48 1.18 0.33 0.25 0.00 0.00 0.00 0.00 20:48:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:48:01 lo 13.16 13.16 2.71 2.71 0.00 0.00 0.00 0.00 20:49:01 ens3 0.77 0.63 0.10 0.09 0.00 0.00 0.00 0.00 20:49:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:49:01 lo 19.48 19.48 20.43 20.43 0.00 0.00 0.00 0.00 20:50:01 ens3 0.65 0.48 0.11 0.10 0.00 0.00 0.00 0.00 20:50:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:50:01 lo 2.82 2.82 3.41 3.41 0.00 0.00 0.00 0.00 20:51:01 ens3 1.93 2.65 0.88 1.06 0.00 0.00 0.00 0.00 20:51:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:51:01 lo 3.53 3.53 0.43 0.43 0.00 0.00 0.00 0.00 20:52:01 ens3 1.13 1.08 0.20 0.19 0.00 0.00 0.00 0.00 20:52:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:52:01 lo 13.53 13.53 21.59 21.59 0.00 0.00 0.00 0.00 20:53:01 ens3 1.23 1.40 0.31 0.26 0.00 0.00 0.00 0.00 20:53:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:53:01 lo 19.26 19.26 9.24 9.24 0.00 0.00 0.00 0.00 20:54:01 ens3 1.12 1.48 0.23 0.24 0.00 0.00 0.00 0.00 20:54:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:54:01 lo 25.58 25.58 8.51 8.51 0.00 0.00 0.00 0.00 20:55:01 ens3 1.87 2.35 0.35 0.37 0.00 0.00 0.00 0.00 20:55:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:55:01 lo 17.11 17.11 7.04 7.04 0.00 0.00 0.00 0.00 20:56:01 ens3 1.35 1.32 0.28 0.26 0.00 0.00 0.00 0.00 20:56:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:56:01 lo 38.89 38.89 15.42 15.42 0.00 0.00 0.00 0.00 20:57:01 ens3 1.12 0.95 0.20 0.18 0.00 0.00 0.00 0.00 20:57:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:57:01 lo 21.74 21.74 6.57 6.57 0.00 0.00 0.00 0.00 20:58:01 ens3 2.27 2.83 1.02 0.88 0.00 0.00 0.00 0.00 20:58:01 docker0 0.00 0.00 0.00 
0.00 0.00 0.00 0.00 0.00 20:58:01 lo 26.40 26.40 11.59 11.59 0.00 0.00 0.00 0.00 20:59:01 ens3 15.05 13.35 3.15 8.96 0.00 0.00 0.00 0.00 20:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 20:59:01 lo 13.43 13.43 9.66 9.66 0.00 0.00 0.00 0.00 21:00:01 ens3 1.00 0.95 0.19 0.18 0.00 0.00 0.00 0.00 21:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:00:01 lo 14.56 14.56 4.90 4.90 0.00 0.00 0.00 0.00 21:01:01 ens3 0.87 0.75 0.12 0.11 0.00 0.00 0.00 0.00 21:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:01:01 lo 10.61 10.61 4.23 4.23 0.00 0.00 0.00 0.00 21:02:01 ens3 1.17 1.07 0.23 0.21 0.00 0.00 0.00 0.00 21:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:02:01 lo 20.83 20.83 9.28 9.28 0.00 0.00 0.00 0.00 21:03:01 ens3 0.67 0.63 0.20 0.14 0.00 0.00 0.00 0.00 21:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:03:01 lo 1.87 1.87 0.71 0.71 0.00 0.00 0.00 0.00 21:04:01 ens3 0.08 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:04:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:05:01 ens3 0.18 0.12 0.01 0.01 0.00 0.00 0.00 0.00 21:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:05:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 21:06:01 ens3 0.62 0.52 0.09 0.08 0.00 0.00 0.00 0.00 21:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:06:01 lo 4.68 4.68 5.71 5.71 0.00 0.00 0.00 0.00 21:07:01 ens3 0.88 0.97 0.14 0.14 0.00 0.00 0.00 0.00 21:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:07:01 lo 8.72 8.72 3.32 3.32 0.00 0.00 0.00 0.00 21:08:01 ens3 0.72 0.78 0.22 0.16 0.00 0.00 0.00 0.00 21:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:08:01 lo 3.78 3.78 0.94 0.94 0.00 0.00 0.00 0.00 21:09:01 ens3 0.90 1.08 0.15 0.16 0.00 0.00 0.00 0.00 21:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:09:01 lo 30.26 30.26 9.78 9.78 0.00 0.00 0.00 0.00 21:10:01 ens3 0.78 0.85 0.13 0.13 0.00 0.00 0.00 0.00 21:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:10:01 lo 10.76 10.76 9.50 9.50 0.00 0.00 0.00 0.00 21:11:01 ens3 0.80 0.93 0.13 0.14 0.00 0.00 0.00 0.00 21:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:11:01 lo 15.18 15.18 9.61 9.61 0.00 0.00 0.00 0.00 21:12:01 ens3 0.90 1.10 0.17 0.17 0.00 0.00 0.00 0.00 21:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:12:01 lo 20.01 20.01 15.38 15.38 0.00 0.00 0.00 0.00 21:13:01 ens3 1.18 1.13 0.28 0.22 0.00 0.00 0.00 0.00 21:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:13:01 lo 20.51 20.51 8.48 8.48 0.00 0.00 0.00 0.00 21:14:01 ens3 0.67 0.53 0.12 0.11 0.00 0.00 0.00 0.00 21:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:14:01 lo 32.61 32.61 11.43 11.43 0.00 0.00 0.00 0.00 21:15:01 ens3 0.55 0.43 0.09 0.08 0.00 0.00 0.00 0.00 21:15:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:15:01 lo 38.74 38.74 10.95 10.95 0.00 0.00 0.00 0.00 21:16:01 ens3 0.12 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:16:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:16:01 lo 0.40 0.40 0.03 0.03 0.00 0.00 0.00 0.00 21:17:01 ens3 0.30 0.10 0.02 0.01 0.00 0.00 0.00 0.00 21:17:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:17:01 lo 0.80 0.80 0.08 0.08 0.00 0.00 0.00 0.00 21:18:01 ens3 0.73 0.32 0.20 0.09 0.00 0.00 0.00 0.00 21:18:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:18:01 lo 3.85 3.85 1.39 1.39 0.00 0.00 0.00 0.00 21:19:01 ens3 1.28 1.12 0.38 0.32 0.00 0.00 0.00 0.00 21:19:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:19:01 lo 16.83 16.83 15.28 15.28 0.00 0.00 0.00 0.00 21:20:01 ens3 
0.77 0.65 0.14 0.13 0.00 0.00 0.00 0.00 21:20:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:20:01 lo 21.03 21.03 10.29 10.29 0.00 0.00 0.00 0.00 21:21:01 ens3 0.63 0.50 0.10 0.09 0.00 0.00 0.00 0.00 21:21:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:21:01 lo 8.05 8.05 4.79 4.79 0.00 0.00 0.00 0.00 21:22:01 ens3 0.55 0.48 0.11 0.10 0.00 0.00 0.00 0.00 21:22:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:22:01 lo 17.38 17.38 8.78 8.78 0.00 0.00 0.00 0.00 21:23:01 ens3 1.43 0.83 0.48 0.36 0.00 0.00 0.00 0.00 21:23:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:23:01 lo 19.95 19.95 7.36 7.36 0.00 0.00 0.00 0.00 21:24:01 ens3 0.48 0.32 0.12 0.06 0.00 0.00 0.00 0.00 21:24:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:24:01 lo 22.46 22.46 9.96 9.96 0.00 0.00 0.00 0.00 21:25:01 ens3 1.22 0.70 0.40 0.29 0.00 0.00 0.00 0.00 21:25:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:25:01 lo 13.40 13.40 5.46 5.46 0.00 0.00 0.00 0.00 21:26:01 ens3 0.43 0.35 0.07 0.07 0.00 0.00 0.00 0.00 21:26:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:26:01 lo 23.80 23.80 9.03 9.03 0.00 0.00 0.00 0.00 21:27:01 ens3 1.02 0.85 0.15 0.14 0.00 0.00 0.00 0.00 21:27:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:27:01 lo 18.36 18.36 8.18 8.18 0.00 0.00 0.00 0.00 21:28:01 ens3 0.82 0.62 0.20 0.17 0.00 0.00 0.00 0.00 21:28:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:28:01 lo 37.51 37.51 12.32 12.32 0.00 0.00 0.00 0.00 21:29:01 ens3 0.65 0.45 0.14 0.08 0.00 0.00 0.00 0.00 21:29:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:29:01 lo 46.46 46.46 13.41 13.41 0.00 0.00 0.00 0.00 21:30:01 ens3 0.52 0.42 0.10 0.09 0.00 0.00 0.00 0.00 21:30:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:30:01 lo 18.58 18.58 6.23 6.23 0.00 0.00 0.00 0.00 21:31:01 ens3 0.40 0.30 0.06 0.05 0.00 0.00 0.00 0.00 21:31:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:31:01 lo 32.98 32.98 9.65 9.65 0.00 0.00 0.00 0.00 21:32:01 ens3 1.15 0.32 0.33 0.22 0.00 0.00 0.00 0.00 21:32:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:32:01 lo 32.46 32.46 9.11 9.11 0.00 0.00 0.00 0.00 21:33:02 ens3 1.40 0.48 0.51 0.31 0.00 0.00 0.00 0.00 21:33:02 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:33:02 lo 55.11 55.11 17.74 17.74 0.00 0.00 0.00 0.00 21:34:01 ens3 1.10 0.64 0.39 0.30 0.00 0.00 0.00 0.00 21:34:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:34:01 lo 29.03 29.03 7.96 7.96 0.00 0.00 0.00 0.00 21:35:01 ens3 0.82 0.67 0.11 0.09 0.00 0.00 0.00 0.00 21:35:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:35:01 lo 22.61 22.61 22.56 22.56 0.00 0.00 0.00 0.00 21:36:01 ens3 0.83 0.60 0.14 0.12 0.00 0.00 0.00 0.00 21:36:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:36:01 lo 31.89 31.89 13.42 13.42 0.00 0.00 0.00 0.00 21:37:01 ens3 0.58 0.50 0.10 0.09 0.00 0.00 0.00 0.00 21:37:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:37:01 lo 19.65 19.65 7.96 7.96 0.00 0.00 0.00 0.00 21:38:01 ens3 0.98 0.73 0.27 0.19 0.00 0.00 0.00 0.00 21:38:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:38:01 lo 10.73 10.73 4.60 4.60 0.00 0.00 0.00 0.00 21:39:01 ens3 0.65 0.60 0.12 0.11 0.00 0.00 0.00 0.00 21:39:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:39:01 lo 37.91 37.91 14.61 14.61 0.00 0.00 0.00 0.00 21:40:01 ens3 0.58 0.43 0.10 0.09 0.00 0.00 0.00 0.00 21:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:40:01 lo 19.10 19.10 6.44 6.44 0.00 0.00 0.00 0.00 21:41:01 ens3 0.58 0.37 0.08 0.06 0.00 0.00 0.00 0.00 21:41:01 docker0 0.00 0.00 0.00 0.00 
0.00 0.00 0.00 0.00 21:41:01 lo 20.76 20.76 8.77 8.77 0.00 0.00 0.00 0.00 21:42:01 ens3 0.92 0.63 0.15 0.12 0.00 0.00 0.00 0.00 21:42:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:42:01 lo 20.50 20.50 8.14 8.14 0.00 0.00 0.00 0.00 21:43:01 ens3 0.97 0.75 0.22 0.15 0.00 0.00 0.00 0.00 21:43:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:43:01 lo 14.81 14.81 19.44 19.44 0.00 0.00 0.00 0.00 21:44:01 ens3 0.83 0.62 0.15 0.13 0.00 0.00 0.00 0.00 21:44:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:44:01 lo 27.92 27.92 10.04 10.04 0.00 0.00 0.00 0.00 21:45:01 ens3 0.57 0.37 0.08 0.06 0.00 0.00 0.00 0.00 21:45:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:45:01 lo 31.18 31.18 15.85 15.85 0.00 0.00 0.00 0.00 21:46:01 ens3 0.47 0.38 0.08 0.07 0.00 0.00 0.00 0.00 21:46:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:46:01 lo 8.80 8.80 5.76 5.76 0.00 0.00 0.00 0.00 21:47:01 ens3 0.88 0.70 0.15 0.13 0.00 0.00 0.00 0.00 21:47:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:47:01 lo 15.68 15.68 8.14 8.14 0.00 0.00 0.00 0.00 21:48:01 ens3 0.60 0.43 0.21 0.13 0.00 0.00 0.00 0.00 21:48:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:48:01 lo 15.21 15.21 8.77 8.77 0.00 0.00 0.00 0.00 21:49:01 ens3 0.53 0.40 0.08 0.07 0.00 0.00 0.00 0.00 21:49:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:49:01 lo 11.45 11.45 5.95 5.95 0.00 0.00 0.00 0.00 21:50:01 ens3 0.63 0.50 0.11 0.10 0.00 0.00 0.00 0.00 21:50:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:50:01 lo 24.78 24.78 12.82 12.82 0.00 0.00 0.00 0.00 21:51:01 ens3 0.38 0.18 0.04 0.03 0.00 0.00 0.00 0.00 21:51:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:51:01 lo 22.53 22.53 8.83 8.83 0.00 0.00 0.00 0.00 21:52:01 ens3 1.85 2.00 0.84 0.74 0.00 0.00 0.00 0.00 21:52:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:52:01 lo 21.01 21.01 7.61 7.61 0.00 0.00 0.00 0.00 21:53:01 ens3 0.98 0.90 0.22 0.16 0.00 0.00 0.00 0.00 21:53:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:53:01 lo 36.78 36.78 32.80 32.80 0.00 0.00 0.00 0.00 21:54:01 ens3 1.32 1.77 0.27 0.29 0.00 0.00 0.00 0.00 21:54:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:54:01 lo 19.38 19.38 9.79 9.79 0.00 0.00 0.00 0.00 21:55:01 ens3 0.78 0.80 0.10 0.10 0.00 0.00 0.00 0.00 21:55:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:55:01 lo 20.86 20.86 5.42 5.42 0.00 0.00 0.00 0.00 21:56:01 ens3 0.88 1.12 0.17 0.18 0.00 0.00 0.00 0.00 21:56:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:56:01 lo 37.98 37.98 19.35 19.35 0.00 0.00 0.00 0.00 21:57:01 ens3 0.80 1.00 0.15 0.16 0.00 0.00 0.00 0.00 21:57:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:57:01 lo 12.33 12.33 5.32 5.32 0.00 0.00 0.00 0.00 21:58:01 ens3 0.62 0.52 0.19 0.12 0.00 0.00 0.00 0.00 21:58:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:58:01 lo 20.61 20.61 7.67 7.67 0.00 0.00 0.00 0.00 21:59:01 ens3 0.88 1.13 0.17 0.18 0.00 0.00 0.00 0.00 21:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 21:59:01 lo 18.76 18.76 7.59 7.59 0.00 0.00 0.00 0.00 22:00:01 ens3 0.65 0.77 0.12 0.12 0.00 0.00 0.00 0.00 22:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:00:01 lo 28.70 28.70 9.52 9.52 0.00 0.00 0.00 0.00 22:01:01 ens3 0.68 0.83 0.13 0.13 0.00 0.00 0.00 0.00 22:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:01:01 lo 21.46 21.46 7.69 7.69 0.00 0.00 0.00 0.00 22:02:01 ens3 0.87 0.92 0.14 0.14 0.00 0.00 0.00 0.00 22:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:02:01 lo 37.85 37.85 11.57 11.57 0.00 0.00 0.00 0.00 
22:03:01 ens3 0.95 1.03 0.25 0.19 0.00 0.00 0.00 0.00 22:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:03:01 lo 41.66 41.66 20.95 20.95 0.00 0.00 0.00 0.00 22:04:01 ens3 1.38 0.72 0.23 0.15 0.00 0.00 0.00 0.00 22:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:04:01 lo 41.28 41.28 15.15 15.15 0.00 0.00 0.00 0.00 22:05:01 ens3 0.95 0.75 0.39 0.33 0.00 0.00 0.00 0.00 22:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:05:01 lo 59.56 59.56 20.86 20.86 0.00 0.00 0.00 0.00 22:06:01 ens3 1.57 1.70 0.77 0.67 0.00 0.00 0.00 0.00 22:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 22:06:01 lo 73.85 73.85 23.77 23.77 0.00 0.00 0.00 0.00 Average: ens3 15.94 12.26 207.88 1.66 0.00 0.00 0.00 0.00 Average: docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 Average: lo 20.45 20.45 9.44 9.44 0.00 0.00 0.00 0.00 ---> sar -P ALL: Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-24687) 09/20/24 _x86_64_ (4 CPU) 20:32:18 LINUX RESTART (4 CPU) 20:33:02 CPU %user %nice %system %iowait %steal %idle 20:34:01 all 15.83 11.20 9.99 3.37 0.09 59.52 20:34:01 0 12.06 11.62 9.28 1.67 0.07 65.30 20:34:01 1 23.69 11.85 11.85 2.79 0.10 49.73 20:34:01 2 12.17 11.88 10.32 3.48 0.09 62.07 20:34:01 3 15.39 9.47 8.53 5.55 0.10 60.96 20:35:01 all 43.04 0.00 3.91 4.27 0.09 48.68 20:35:01 0 46.90 0.00 3.87 2.63 0.08 46.52 20:35:01 1 45.85 0.00 3.89 3.23 0.13 46.89 20:35:01 2 48.33 0.00 4.08 4.11 0.08 43.39 20:35:01 3 31.09 0.00 3.81 7.11 0.07 57.93 20:36:01 all 81.70 0.00 3.16 2.12 0.12 12.89 20:36:01 0 76.29 0.00 2.59 0.62 0.12 20.38 20:36:01 1 75.07 0.00 3.80 5.30 0.13 15.69 20:36:01 2 91.54 0.00 1.83 0.42 0.10 6.11 20:36:01 3 83.86 0.00 4.44 2.17 0.13 9.40 20:37:01 all 65.11 0.00 3.88 2.36 0.10 28.55 20:37:01 0 77.81 0.00 4.22 2.18 0.12 15.68 20:37:01 1 59.88 0.00 4.11 1.88 0.10 34.03 20:37:01 2 65.46 0.00 4.00 2.90 0.08 27.55 20:37:01 3 57.33 0.00 3.20 2.48 0.08 36.91 20:38:01 all 86.29 0.00 5.16 5.03 0.12 3.40 20:38:01 0 82.79 0.00 4.95 8.86 0.10 3.30 20:38:01 1 87.63 0.00 4.63 3.57 0.14 4.04 20:38:01 2 86.53 0.00 5.24 5.17 0.12 2.94 20:38:01 3 88.21 0.00 5.84 2.52 0.12 3.32 20:39:01 all 81.81 0.00 2.85 0.13 0.10 15.11 20:39:01 0 83.36 0.00 2.92 0.05 0.08 13.59 20:39:01 1 81.59 0.00 2.47 0.07 0.08 15.79 20:39:01 2 81.97 0.00 2.81 0.15 0.10 14.97 20:39:01 3 80.32 0.00 3.23 0.25 0.12 16.08 20:40:01 all 56.31 0.00 1.79 0.44 0.09 41.36 20:40:01 0 57.31 0.00 1.83 0.65 0.10 40.11 20:40:01 1 54.50 0.00 1.70 0.15 0.08 43.57 20:40:01 2 59.97 0.00 1.88 0.77 0.08 37.30 20:40:01 3 53.48 0.00 1.75 0.20 0.10 44.46 20:41:01 all 26.34 0.00 1.15 0.34 0.10 72.07 20:41:01 0 26.50 0.00 1.12 0.34 0.10 71.94 20:41:01 1 24.20 0.00 1.26 0.00 0.10 74.44 20:41:01 2 28.09 0.00 0.91 0.12 0.10 70.78 20:41:01 3 26.56 0.00 1.32 0.90 0.08 71.13 20:42:01 all 46.71 0.00 1.68 0.94 0.10 50.57 20:42:01 0 45.63 0.00 1.70 0.20 0.10 52.37 20:42:01 1 52.26 0.00 1.97 0.86 0.10 44.81 20:42:01 2 43.00 0.00 1.48 0.64 0.10 54.79 20:42:01 3 45.97 0.00 1.58 2.05 0.10 50.31 20:43:01 all 12.94 0.00 0.70 0.27 0.10 85.99 20:43:01 0 16.31 0.00 0.87 0.38 0.10 82.34 20:43:01 1 10.69 0.00 0.85 0.47 0.10 87.89 20:43:01 2 11.30 0.00 0.64 0.05 0.12 87.89 20:43:01 3 13.49 0.00 0.42 0.18 0.10 85.81 20:44:01 all 48.47 0.00 1.81 0.23 0.10 49.39 20:44:01 0 50.91 0.00 1.16 0.00 0.10 47.83 20:44:01 1 48.59 0.00 2.39 0.08 0.10 48.83 20:44:01 2 49.46 0.00 1.76 0.37 0.10 48.31 20:44:01 3 44.93 0.00 1.92 0.45 0.10 52.60 20:44:01 CPU %user %nice %system %iowait %steal %idle 20:45:01 all 29.08 0.00 1.14 0.32 0.10 69.36 20:45:01 0 31.54 0.00 1.14 0.05 0.12 67.15 
20:45:01 1 27.02 0.00 1.06 0.00 0.10 71.82 20:45:01 2 31.53 0.00 1.22 0.00 0.08 67.17 20:45:01 3 26.24 0.00 1.13 1.23 0.10 71.30 20:46:01 all 3.67 0.00 0.26 0.02 0.11 95.94 20:46:01 0 5.19 0.00 0.24 0.00 0.10 94.48 20:46:01 1 2.44 0.00 0.30 0.00 0.12 97.14 20:46:01 2 3.78 0.00 0.23 0.00 0.10 95.89 20:46:01 3 3.27 0.00 0.27 0.07 0.12 96.27 20:47:01 all 35.25 0.00 1.16 0.28 0.12 63.19 20:47:01 0 33.76 0.00 1.04 0.03 0.12 65.05 20:47:01 1 34.73 0.00 1.33 0.13 0.12 63.69 20:47:01 2 36.82 0.00 1.21 0.34 0.13 61.51 20:47:01 3 35.70 0.00 1.07 0.60 0.10 62.52 20:48:01 all 72.59 0.00 2.69 0.10 0.08 24.54 20:48:01 0 73.36 0.00 2.91 0.03 0.08 23.61 20:48:01 1 71.94 0.00 2.24 0.15 0.08 25.58 20:48:01 2 73.07 0.00 2.35 0.03 0.08 24.46 20:48:01 3 71.99 0.00 3.25 0.18 0.08 24.49 20:49:01 all 52.54 0.00 1.74 0.21 0.10 45.41 20:49:01 0 53.01 0.00 1.61 0.00 0.12 45.26 20:49:01 1 53.10 0.00 1.68 0.05 0.08 45.08 20:49:01 2 53.32 0.00 1.71 0.17 0.10 44.71 20:49:01 3 50.74 0.00 1.97 0.60 0.10 46.59 20:50:01 all 1.67 0.00 0.24 0.17 0.11 97.82 20:50:01 0 1.77 0.00 0.37 0.02 0.08 97.76 20:50:01 1 1.16 0.00 0.20 0.02 0.15 98.47 20:50:01 2 1.67 0.00 0.22 0.25 0.10 97.75 20:50:01 3 2.08 0.00 0.17 0.39 0.08 97.28 20:51:01 all 3.25 0.00 0.42 0.37 0.10 95.86 20:51:01 0 2.81 0.00 0.66 0.00 0.08 96.45 20:51:01 1 3.32 0.00 0.54 0.08 0.08 95.98 20:51:01 2 3.20 0.00 0.19 0.52 0.10 95.99 20:51:01 3 3.66 0.00 0.32 0.89 0.12 95.01 20:52:01 all 56.02 0.00 2.19 2.30 0.11 39.38 20:52:01 0 54.62 0.00 2.07 3.66 0.10 39.54 20:52:01 1 56.45 0.00 2.37 3.33 0.12 37.74 20:52:01 2 54.62 0.00 2.64 1.14 0.12 41.48 20:52:01 3 58.40 0.00 1.69 1.07 0.10 38.73 20:53:01 all 20.35 0.00 0.82 0.36 0.10 78.37 20:53:01 0 20.23 0.00 0.79 0.13 0.10 78.74 20:53:01 1 20.31 0.00 0.88 0.27 0.08 78.46 20:53:01 2 19.55 0.00 0.74 0.00 0.10 79.61 20:53:01 3 21.31 0.00 0.89 1.03 0.10 76.67 20:54:01 all 2.80 0.00 0.29 0.05 0.09 96.78 20:54:01 0 2.64 0.00 0.34 0.00 0.08 96.94 20:54:01 1 2.93 0.00 0.23 0.10 0.10 96.63 20:54:01 2 2.90 0.00 0.27 0.00 0.08 96.75 20:54:01 3 2.74 0.00 0.30 0.08 0.08 96.79 20:55:01 all 37.70 0.00 1.31 0.10 0.10 60.80 20:55:01 0 37.47 0.00 1.34 0.10 0.10 60.99 20:55:01 1 36.70 0.00 1.30 0.15 0.10 61.74 20:55:01 2 38.46 0.00 1.21 0.08 0.10 60.14 20:55:01 3 38.15 0.00 1.37 0.07 0.08 60.32 20:55:01 CPU %user %nice %system %iowait %steal %idle 20:56:01 all 9.41 0.00 0.41 0.01 0.11 90.05 20:56:01 0 10.06 0.00 0.44 0.03 0.12 89.36 20:56:01 1 9.66 0.00 0.32 0.00 0.08 89.93 20:56:01 2 9.19 0.00 0.44 0.00 0.12 90.26 20:56:01 3 8.75 0.00 0.45 0.00 0.13 90.67 20:57:01 all 23.41 0.00 1.01 0.04 0.10 75.44 20:57:01 0 23.62 0.00 0.95 0.10 0.10 75.23 20:57:01 1 23.99 0.00 1.17 0.05 0.12 74.67 20:57:01 2 24.32 0.00 0.90 0.00 0.10 74.68 20:57:01 3 21.73 0.00 1.03 0.00 0.08 77.16 20:58:01 all 13.37 0.00 0.59 0.35 0.09 85.59 20:58:01 0 11.61 0.00 0.62 1.13 0.08 86.56 20:58:01 1 12.88 0.00 0.44 0.00 0.07 86.61 20:58:01 2 17.07 0.00 0.69 0.07 0.10 82.07 20:58:01 3 11.93 0.00 0.62 0.22 0.10 87.13 20:59:01 all 44.81 0.00 1.45 0.39 0.10 53.25 20:59:01 0 42.69 0.00 1.48 0.85 0.10 54.88 20:59:01 1 46.92 0.00 1.51 0.37 0.10 51.11 20:59:01 2 45.51 0.00 1.59 0.03 0.08 52.78 20:59:01 3 44.14 0.00 1.21 0.32 0.10 54.23 21:00:01 all 37.71 0.00 1.35 0.40 0.10 60.44 21:00:01 0 36.61 0.00 1.46 1.49 0.10 60.34 21:00:01 1 39.38 0.00 1.19 0.03 0.12 59.28 21:00:01 2 38.12 0.00 1.39 0.03 0.08 60.37 21:00:01 3 36.73 0.00 1.36 0.05 0.08 61.78 21:01:01 all 46.16 0.00 1.69 0.31 0.12 51.72 21:01:01 0 43.35 0.00 2.19 0.65 0.10 53.70 21:01:01 1 48.84 0.00 1.39 0.03 0.10 49.64 21:01:01 
2 48.86 0.00 1.72 0.54 0.13 48.75 21:01:01 3 43.59 0.00 1.48 0.02 0.13 54.78 21:02:01 all 6.52 0.00 0.25 0.06 0.08 93.10 21:02:01 0 6.47 0.00 0.29 0.20 0.07 92.98 21:02:01 1 6.44 0.00 0.20 0.00 0.08 93.27 21:02:01 2 5.64 0.00 0.25 0.00 0.08 94.02 21:02:01 3 7.50 0.00 0.25 0.03 0.08 92.13 21:03:01 all 0.53 0.00 0.14 0.01 0.08 99.23 21:03:01 0 0.45 0.00 0.10 0.03 0.08 99.33 21:03:01 1 0.47 0.00 0.18 0.00 0.08 99.26 21:03:01 2 0.82 0.00 0.13 0.00 0.08 98.96 21:03:01 3 0.39 0.00 0.15 0.02 0.08 99.36 21:04:01 all 0.25 0.00 0.10 0.01 0.08 99.56 21:04:01 0 0.29 0.00 0.19 0.02 0.10 99.41 21:04:01 1 0.27 0.00 0.10 0.00 0.08 99.55 21:04:01 2 0.15 0.00 0.03 0.00 0.03 99.78 21:04:01 3 0.30 0.00 0.08 0.02 0.08 99.51 21:05:01 all 0.45 0.00 0.13 0.01 0.08 99.34 21:05:01 0 0.27 0.00 0.13 0.02 0.08 99.50 21:05:01 1 0.32 0.00 0.13 0.00 0.08 99.47 21:05:01 2 0.10 0.00 0.08 0.00 0.07 99.75 21:05:01 3 1.12 0.00 0.15 0.02 0.08 98.63 21:06:01 all 29.37 0.00 1.00 0.31 0.08 69.24 21:06:01 0 30.54 0.00 0.84 0.75 0.08 67.79 21:06:01 1 30.85 0.00 1.22 0.17 0.08 67.68 21:06:01 2 27.76 0.00 1.20 0.03 0.08 70.92 21:06:01 3 28.35 0.00 0.75 0.27 0.07 70.56 21:06:01 CPU %user %nice %system %iowait %steal %idle 21:07:01 all 30.86 0.00 1.12 0.33 0.10 67.58 21:07:01 0 32.34 0.00 1.57 0.02 0.10 65.97 21:07:01 1 31.66 0.00 1.19 1.12 0.08 65.95 21:07:01 2 28.81 0.00 0.84 0.02 0.10 70.23 21:07:01 3 30.65 0.00 0.90 0.18 0.10 68.16 21:08:01 all 1.62 0.00 0.19 0.01 0.08 98.10 21:08:01 0 1.72 0.00 0.30 0.00 0.08 97.90 21:08:01 1 1.87 0.00 0.17 0.05 0.08 97.83 21:08:01 2 1.34 0.00 0.10 0.00 0.07 98.49 21:08:01 3 1.53 0.00 0.20 0.00 0.08 98.18 21:09:01 all 46.32 0.00 1.74 0.27 0.10 51.58 21:09:01 0 42.59 0.00 1.57 0.52 0.10 55.22 21:09:01 1 48.54 0.00 1.92 0.47 0.10 48.96 21:09:01 2 46.18 0.00 1.69 0.02 0.10 52.02 21:09:01 3 47.95 0.00 1.78 0.07 0.10 50.10 21:10:01 all 23.51 0.00 0.54 0.28 0.09 75.59 21:10:01 0 20.45 0.00 0.52 0.02 0.10 78.92 21:10:01 1 25.07 0.00 0.49 0.65 0.08 73.71 21:10:01 2 24.39 0.00 0.45 0.02 0.10 75.04 21:10:01 3 24.13 0.00 0.69 0.42 0.08 74.68 21:11:01 all 34.92 0.00 1.15 0.24 0.10 63.58 21:11:01 0 34.83 0.00 1.13 0.02 0.10 63.92 21:11:01 1 34.45 0.00 1.49 0.55 0.10 63.41 21:11:01 2 36.03 0.00 1.00 0.07 0.08 62.81 21:11:01 3 34.38 0.00 0.99 0.34 0.10 64.19 21:12:01 all 8.46 0.00 0.25 0.03 0.08 91.17 21:12:01 0 8.25 0.00 0.22 0.05 0.08 91.40 21:12:01 1 8.11 0.00 0.28 0.05 0.08 91.47 21:12:01 2 8.14 0.00 0.25 0.00 0.07 91.54 21:12:01 3 9.36 0.00 0.25 0.03 0.10 90.25 21:13:01 all 50.19 0.00 1.57 0.38 0.10 47.76 21:13:01 0 51.26 0.00 1.83 0.00 0.10 46.81 21:13:01 1 46.06 0.00 1.43 1.41 0.12 50.99 21:13:01 2 49.66 0.00 1.21 0.02 0.12 48.99 21:13:01 3 53.75 0.00 1.81 0.08 0.08 44.28 21:14:01 all 6.64 0.00 0.27 0.01 0.10 92.98 21:14:01 0 6.33 0.00 0.23 0.00 0.10 93.33 21:14:01 1 6.40 0.00 0.28 0.05 0.10 93.16 21:14:01 2 7.42 0.00 0.28 0.00 0.10 92.20 21:14:01 3 6.41 0.00 0.27 0.00 0.10 93.22 21:15:01 all 4.06 0.00 0.22 0.01 0.09 95.63 21:15:01 0 3.65 0.00 0.28 0.02 0.10 95.95 21:15:01 1 3.61 0.00 0.18 0.03 0.05 96.12 21:15:01 2 5.40 0.00 0.18 0.00 0.12 94.31 21:15:01 3 3.55 0.00 0.22 0.00 0.08 96.15 21:16:01 all 0.65 0.00 0.10 0.00 0.07 99.17 21:16:01 0 0.35 0.00 0.03 0.00 0.03 99.58 21:16:01 1 0.20 0.00 0.12 0.02 0.10 99.56 21:16:01 2 1.71 0.00 0.13 0.00 0.08 98.08 21:16:01 3 0.33 0.00 0.12 0.00 0.07 99.48 21:17:01 all 0.23 0.00 0.13 0.01 0.10 99.53 21:17:01 0 0.22 0.00 0.10 0.02 0.07 99.60 21:17:01 1 0.27 0.00 0.18 0.03 0.12 99.40 21:17:01 2 0.20 0.00 0.12 0.00 0.08 99.60 21:17:01 3 0.23 0.00 0.12 0.00 0.12 99.53 
21:17:01 CPU %user %nice %system %iowait %steal %idle 21:18:01 all 0.61 0.00 0.12 0.01 0.08 99.18 21:18:01 0 0.52 0.00 0.13 0.00 0.07 99.28 21:18:01 1 0.72 0.00 0.12 0.03 0.07 99.06 21:18:01 2 0.61 0.00 0.15 0.00 0.08 99.16 21:18:01 3 0.60 0.00 0.08 0.00 0.08 99.23 21:19:01 all 48.84 0.00 1.65 0.32 0.10 49.08 21:19:01 0 50.55 0.00 2.02 0.17 0.10 47.16 21:19:01 1 47.81 0.00 1.68 0.97 0.10 49.44 21:19:01 2 48.43 0.00 1.28 0.02 0.10 50.17 21:19:01 3 48.55 0.00 1.64 0.13 0.10 49.57 21:20:01 all 7.14 0.00 0.25 0.04 0.09 92.48 21:20:01 0 7.30 0.00 0.27 0.00 0.10 92.33 21:20:01 1 7.07 0.00 0.18 0.08 0.08 92.58 21:20:01 2 7.03 0.00 0.23 0.08 0.08 92.57 21:20:01 3 7.16 0.00 0.30 0.00 0.08 92.46 21:21:01 all 1.73 0.00 0.16 0.02 0.08 98.01 21:21:01 0 1.49 0.00 0.15 0.05 0.08 98.23 21:21:01 1 1.85 0.00 0.13 0.02 0.08 97.92 21:21:01 2 1.91 0.00 0.13 0.00 0.08 97.87 21:21:01 3 1.67 0.00 0.23 0.00 0.08 98.01 21:22:01 all 2.45 0.00 0.20 0.01 0.08 97.26 21:22:01 0 2.86 0.00 0.23 0.02 0.08 96.81 21:22:01 1 2.15 0.00 0.18 0.03 0.08 97.55 21:22:01 2 2.66 0.00 0.27 0.00 0.08 96.99 21:22:01 3 2.12 0.00 0.12 0.00 0.07 97.69 21:23:01 all 2.43 0.00 0.22 0.01 0.10 97.25 21:23:01 0 2.75 0.00 0.27 0.02 0.08 96.88 21:23:01 1 2.98 0.00 0.23 0.02 0.08 96.68 21:23:01 2 1.96 0.00 0.15 0.00 0.07 97.83 21:23:01 3 2.00 0.00 0.22 0.00 0.18 97.59 21:24:01 all 2.92 0.00 0.19 0.01 0.09 96.79 21:24:01 0 2.88 0.00 0.23 0.00 0.08 96.81 21:24:01 1 2.51 0.00 0.20 0.03 0.10 97.15 21:24:01 2 2.71 0.00 0.15 0.02 0.08 97.04 21:24:01 3 3.58 0.00 0.18 0.00 0.08 96.15 21:25:01 all 1.60 0.00 0.16 0.01 0.10 98.13 21:25:01 0 1.74 0.00 0.15 0.02 0.10 97.99 21:25:01 1 1.85 0.00 0.18 0.02 0.10 97.84 21:25:01 2 1.64 0.00 0.13 0.00 0.08 98.14 21:25:01 3 1.17 0.00 0.17 0.00 0.13 98.53 21:26:01 all 1.89 0.00 0.16 0.03 0.08 97.85 21:26:01 0 1.81 0.00 0.17 0.02 0.08 97.92 21:26:01 1 1.70 0.00 0.10 0.08 0.07 98.05 21:26:01 2 2.07 0.00 0.22 0.00 0.10 97.61 21:26:01 3 1.96 0.00 0.15 0.00 0.08 97.81 21:27:01 all 47.10 0.00 1.50 0.28 0.09 51.04 21:27:01 0 48.20 0.00 1.19 0.32 0.10 50.19 21:27:01 1 47.26 0.00 1.87 0.79 0.08 50.00 21:27:01 2 47.84 0.00 1.30 0.00 0.08 50.77 21:27:01 3 45.09 0.00 1.63 0.00 0.08 53.20 21:28:01 all 8.44 0.00 0.39 0.02 0.10 91.06 21:28:01 0 7.85 0.00 0.38 0.00 0.10 91.67 21:28:01 1 8.40 0.00 0.37 0.08 0.10 91.05 21:28:01 2 8.65 0.00 0.28 0.00 0.08 90.98 21:28:01 3 8.85 0.00 0.51 0.00 0.10 90.54 21:28:01 CPU %user %nice %system %iowait %steal %idle 21:29:01 all 4.86 0.00 0.35 0.03 0.09 94.67 21:29:01 0 4.94 0.00 0.38 0.00 0.10 94.58 21:29:01 1 4.76 0.00 0.25 0.08 0.08 94.83 21:29:01 2 4.81 0.00 0.43 0.00 0.10 94.66 21:29:01 3 4.96 0.00 0.32 0.02 0.08 94.63 21:30:01 all 2.14 0.00 0.27 0.01 0.08 97.51 21:30:01 0 2.40 0.00 0.40 0.02 0.07 97.11 21:30:01 1 2.04 0.00 0.20 0.00 0.08 97.68 21:30:01 2 1.89 0.00 0.25 0.00 0.08 97.77 21:30:01 3 2.22 0.00 0.22 0.02 0.07 97.48 21:31:01 all 2.95 0.00 0.33 0.05 0.09 96.60 21:31:01 0 2.81 0.00 0.35 0.02 0.08 96.74 21:31:01 1 3.03 0.00 0.32 0.00 0.10 96.55 21:31:01 2 2.82 0.00 0.35 0.00 0.08 96.74 21:31:01 3 3.11 0.00 0.28 0.17 0.08 96.35 21:32:01 all 2.08 0.00 0.30 0.02 0.09 97.50 21:32:01 0 2.09 0.00 0.32 0.02 0.08 97.49 21:32:01 1 1.74 0.00 0.25 0.00 0.07 97.95 21:32:01 2 2.57 0.00 0.38 0.00 0.12 96.93 21:32:01 3 1.94 0.00 0.27 0.05 0.10 97.64 21:33:02 all 4.44 0.00 0.34 0.01 0.08 95.12 21:33:02 0 4.61 0.00 0.25 0.00 0.07 95.07 21:33:02 1 4.90 0.00 0.42 0.00 0.10 94.58 21:33:02 2 4.40 0.00 0.35 0.00 0.07 95.18 21:33:02 3 3.86 0.00 0.35 0.05 0.08 95.66 21:34:01 all 23.62 0.00 0.89 0.02 0.10 75.37 
21:34:01 0 24.76 0.00 0.58 0.00 0.14 74.53 21:34:01 1 23.82 0.00 1.14 0.00 0.09 74.96 21:34:01 2 24.64 0.00 0.98 0.02 0.10 74.25 21:34:01 3 21.28 0.00 0.85 0.05 0.09 77.74 21:35:01 all 41.80 0.00 1.19 0.25 0.09 56.67 21:35:01 0 44.82 0.00 1.32 0.44 0.10 53.32 21:35:01 1 39.38 0.00 0.75 0.00 0.10 59.77 21:35:01 2 41.39 0.00 1.54 0.22 0.08 56.77 21:35:01 3 41.61 0.00 1.17 0.35 0.08 56.79 21:36:01 all 7.97 0.00 0.31 0.01 0.08 91.63 21:36:01 0 7.53 0.00 0.28 0.00 0.08 92.10 21:36:01 1 7.73 0.00 0.30 0.00 0.07 91.90 21:36:01 2 8.86 0.00 0.32 0.03 0.08 90.71 21:36:01 3 7.74 0.00 0.35 0.02 0.08 91.81 21:37:01 all 3.99 0.00 0.23 0.01 0.09 95.68 21:37:01 0 3.77 0.00 0.18 0.02 0.08 95.94 21:37:01 1 3.81 0.00 0.23 0.00 0.08 95.87 21:37:01 2 4.23 0.00 0.23 0.03 0.10 95.40 21:37:01 3 4.12 0.00 0.25 0.00 0.10 95.53 21:38:01 all 2.78 0.00 0.18 0.01 0.09 96.94 21:38:01 0 2.09 0.00 0.20 0.02 0.08 97.61 21:38:01 1 2.39 0.00 0.17 0.00 0.08 97.36 21:38:01 2 4.18 0.00 0.13 0.02 0.10 95.57 21:38:01 3 2.46 0.00 0.23 0.00 0.08 97.22 21:39:01 all 4.63 0.00 0.24 0.02 0.08 95.03 21:39:01 0 4.85 0.00 0.17 0.05 0.07 94.87 21:39:01 1 3.98 0.00 0.22 0.00 0.07 95.74 21:39:01 2 5.34 0.00 0.30 0.02 0.10 94.24 21:39:01 3 4.37 0.00 0.27 0.00 0.10 95.27 21:39:01 CPU %user %nice %system %iowait %steal %idle 21:40:01 all 1.81 0.00 0.19 0.01 0.10 97.89 21:40:01 0 1.53 0.00 0.17 0.02 0.08 98.21 21:40:01 1 1.78 0.00 0.12 0.00 0.10 98.01 21:40:01 2 2.13 0.00 0.25 0.03 0.10 97.49 21:40:01 3 1.82 0.00 0.22 0.00 0.10 97.86 21:41:01 all 1.98 0.00 0.20 0.01 0.10 97.70 21:41:01 0 1.97 0.00 0.22 0.02 0.12 97.67 21:41:01 1 2.01 0.00 0.18 0.00 0.08 97.72 21:41:01 2 2.06 0.00 0.18 0.02 0.10 97.64 21:41:01 3 1.89 0.00 0.22 0.00 0.12 97.77 21:42:01 all 10.69 0.00 0.59 0.08 0.10 88.54 21:42:01 0 11.16 0.00 0.72 0.08 0.08 87.95 21:42:01 1 10.51 0.00 0.72 0.07 0.12 88.59 21:42:01 2 10.24 0.00 0.60 0.18 0.10 88.87 21:42:01 3 10.84 0.00 0.32 0.00 0.08 88.76 21:43:01 all 52.05 0.00 1.40 0.25 0.09 46.21 21:43:01 0 52.31 0.00 1.07 0.00 0.08 46.54 21:43:01 1 51.04 0.00 1.80 0.40 0.08 46.68 21:43:01 2 53.60 0.00 1.69 0.22 0.10 44.40 21:43:01 3 51.27 0.00 1.05 0.37 0.10 47.21 21:44:01 all 7.43 0.00 0.34 0.01 0.10 92.13 21:44:01 0 6.79 0.00 0.38 0.00 0.10 92.72 21:44:01 1 8.06 0.00 0.25 0.00 0.08 91.61 21:44:01 2 7.97 0.00 0.35 0.02 0.10 91.57 21:44:01 3 6.88 0.00 0.37 0.03 0.10 92.61 21:45:01 all 8.55 0.00 0.43 0.02 0.10 90.91 21:45:01 0 8.25 0.00 0.40 0.03 0.10 91.22 21:45:01 1 8.03 0.00 0.35 0.00 0.08 91.53 21:45:01 2 8.42 0.00 0.47 0.00 0.08 91.03 21:45:01 3 9.50 0.00 0.48 0.03 0.12 89.87 21:46:01 all 2.69 0.00 0.24 0.01 0.10 96.97 21:46:01 0 2.70 0.00 0.18 0.02 0.10 97.00 21:46:01 1 2.04 0.00 0.15 0.00 0.08 97.72 21:46:01 2 2.50 0.00 0.42 0.00 0.12 96.97 21:46:01 3 3.50 0.00 0.22 0.02 0.08 96.18 21:47:01 all 2.28 0.00 0.30 0.02 0.08 97.32 21:47:01 0 2.51 0.00 0.42 0.03 0.10 96.94 21:47:01 1 2.41 0.00 0.25 0.00 0.07 97.28 21:47:01 2 2.27 0.00 0.32 0.00 0.08 97.33 21:47:01 3 1.92 0.00 0.22 0.03 0.08 97.75 21:48:01 all 1.85 0.00 0.28 0.01 0.09 97.77 21:48:01 0 2.02 0.00 0.28 0.02 0.10 97.58 21:48:01 1 1.86 0.00 0.20 0.00 0.10 97.84 21:48:01 2 1.68 0.00 0.25 0.00 0.08 97.98 21:48:01 3 1.83 0.00 0.38 0.03 0.08 97.67 21:49:01 all 1.68 0.00 0.28 0.01 0.09 97.95 21:49:01 0 1.65 0.00 0.38 0.03 0.08 97.85 21:49:01 1 2.20 0.00 0.18 0.00 0.10 97.52 21:49:01 2 1.35 0.00 0.23 0.00 0.08 98.33 21:49:01 3 1.52 0.00 0.30 0.02 0.08 98.08 21:50:01 all 2.29 0.00 0.24 0.02 0.08 97.37 21:50:01 0 2.74 0.00 0.33 0.00 0.10 96.82 21:50:01 1 2.11 0.00 0.25 0.00 0.08 97.55 
21:50:01 2 1.92 0.00 0.20 0.02 0.08 97.78 21:50:01 3 2.40 0.00 0.17 0.05 0.07 97.32 21:50:01 CPU %user %nice %system %iowait %steal %idle 21:51:01 all 2.12 0.00 0.25 0.02 0.10 97.51 21:51:01 0 2.29 0.00 0.35 0.02 0.12 97.23 21:51:01 1 2.06 0.00 0.25 0.00 0.10 97.59 21:51:01 2 2.32 0.00 0.20 0.02 0.10 97.36 21:51:01 3 1.80 0.00 0.22 0.03 0.08 97.86 21:52:01 all 4.94 0.00 0.53 0.11 0.08 94.33 21:52:01 0 2.62 0.00 0.45 0.15 0.08 96.69 21:52:01 1 2.55 0.00 0.44 0.00 0.10 96.92 21:52:01 2 11.53 0.00 0.70 0.08 0.07 87.61 21:52:01 3 3.04 0.00 0.55 0.22 0.08 96.11 21:53:01 all 58.21 0.00 1.65 0.62 0.10 39.42 21:53:01 0 57.88 0.00 2.21 0.97 0.10 38.84 21:53:01 1 56.24 0.00 1.27 0.80 0.10 41.59 21:53:01 2 61.35 0.00 1.17 0.60 0.10 36.77 21:53:01 3 57.38 0.00 1.96 0.08 0.10 40.48 21:54:01 all 3.22 0.00 0.20 0.01 0.09 96.48 21:54:01 0 3.69 0.00 0.17 0.02 0.10 96.03 21:54:01 1 2.68 0.00 0.18 0.03 0.08 97.02 21:54:01 2 3.79 0.00 0.22 0.00 0.08 95.91 21:54:01 3 2.73 0.00 0.22 0.00 0.08 96.97 21:55:01 all 34.80 0.00 1.07 0.03 0.08 64.03 21:55:01 0 35.21 0.00 1.07 0.02 0.07 63.64 21:55:01 1 36.79 0.00 1.16 0.05 0.07 61.93 21:55:01 2 36.23 0.00 1.19 0.00 0.10 62.48 21:55:01 3 30.97 0.00 0.87 0.03 0.08 68.05 21:56:01 all 22.88 0.00 0.54 0.28 0.10 76.21 21:56:01 0 22.41 0.00 0.74 0.08 0.08 76.69 21:56:01 1 23.83 0.00 0.54 0.85 0.10 74.68 21:56:01 2 22.93 0.00 0.49 0.08 0.12 76.38 21:56:01 3 22.34 0.00 0.40 0.08 0.08 77.09 21:57:01 all 2.28 0.00 0.15 0.01 0.09 97.46 21:57:01 0 2.08 0.00 0.17 0.00 0.10 97.65 21:57:01 1 2.61 0.00 0.20 0.03 0.12 97.03 21:57:01 2 2.26 0.00 0.10 0.02 0.07 97.56 21:57:01 3 2.17 0.00 0.13 0.00 0.08 97.61 21:58:01 all 2.27 0.00 0.18 0.01 0.09 97.45 21:58:01 0 2.40 0.00 0.10 0.00 0.08 97.41 21:58:01 1 2.22 0.00 0.23 0.02 0.12 97.41 21:58:01 2 2.19 0.00 0.20 0.03 0.08 97.49 21:58:01 3 2.24 0.00 0.18 0.00 0.08 97.49 21:59:01 all 2.31 0.00 0.15 0.02 0.07 97.45 21:59:01 0 2.14 0.00 0.13 0.02 0.05 97.66 21:59:01 1 1.81 0.00 0.15 0.05 0.08 97.91 21:59:01 2 3.03 0.00 0.15 0.02 0.07 96.74 21:59:01 3 2.27 0.00 0.15 0.00 0.08 97.49 22:00:01 all 2.90 0.00 0.22 0.01 0.08 96.79 22:00:01 0 2.33 0.00 0.25 0.02 0.10 97.30 22:00:01 1 2.85 0.00 0.22 0.00 0.08 96.85 22:00:01 2 2.70 0.00 0.18 0.02 0.08 97.02 22:00:01 3 3.71 0.00 0.22 0.00 0.07 96.01 22:01:01 all 1.91 0.00 0.18 0.01 0.08 97.81 22:01:01 0 1.75 0.00 0.18 0.02 0.07 97.98 22:01:01 1 1.93 0.00 0.22 0.00 0.10 97.76 22:01:01 2 2.23 0.00 0.18 0.03 0.08 97.47 22:01:01 3 1.76 0.00 0.15 0.00 0.07 98.03 22:01:01 CPU %user %nice %system %iowait %steal %idle 22:02:01 all 29.30 0.00 1.06 0.03 0.10 69.51 22:02:01 0 30.46 0.00 0.90 0.08 0.12 68.44 22:02:01 1 27.29 0.00 0.85 0.02 0.12 71.72 22:02:01 2 28.23 0.00 1.13 0.00 0.08 70.56 22:02:01 3 31.22 0.00 1.35 0.02 0.08 67.32 22:03:01 all 34.94 0.00 1.11 0.26 0.09 63.60 22:03:01 0 35.99 0.00 1.51 0.65 0.10 61.75 22:03:01 1 35.00 0.00 0.72 0.02 0.08 64.18 22:03:01 2 32.09 0.00 1.10 0.00 0.10 66.71 22:03:01 3 36.69 0.00 1.10 0.38 0.08 61.74 22:04:01 all 6.67 0.00 0.27 0.07 0.10 92.89 22:04:01 0 6.57 0.00 0.28 0.23 0.10 92.81 22:04:01 1 6.42 0.00 0.28 0.03 0.12 93.15 22:04:01 2 6.52 0.00 0.30 0.00 0.10 93.08 22:04:01 3 7.17 0.00 0.22 0.00 0.10 92.51 22:05:01 all 5.98 0.00 0.31 0.03 0.08 93.59 22:05:01 0 5.66 0.00 0.32 0.08 0.08 93.85 22:05:01 1 5.89 0.00 0.25 0.03 0.08 93.74 22:05:01 2 5.54 0.00 0.32 0.00 0.07 94.07 22:05:01 3 6.83 0.00 0.35 0.00 0.10 92.72 22:06:01 all 6.62 0.00 0.52 0.08 0.10 92.68 22:06:01 0 4.98 0.00 0.64 0.23 0.10 94.04 22:06:01 1 5.09 0.00 0.44 0.03 0.10 94.34 22:06:01 2 11.47 0.00 0.57 
0.03 0.10 87.83 22:06:01 3 4.94 0.00 0.44 0.00 0.12 94.50 Average: all 19.76 0.12 0.92 0.33 0.09 78.78 Average: 0 19.82 0.12 0.93 0.34 0.09 78.69 Average: 1 19.65 0.12 0.94 0.35 0.09 78.84 Average: 2 20.21 0.13 0.90 0.25 0.09 78.42 Average: 3 19.36 0.10 0.92 0.37 0.09 79.16
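If these node metrics need to be mined (for example to spot which intervals of the run were CPU-bound), a throwaway filter over the archived sar text is usually enough. A hedged sketch, assuming sar's usual one-interval-per-row text layout and a hypothetical sar-cpu.txt file holding the 'sar -P ALL' section above:

# Hedged helper, not part of the job: print per-interval all-CPU user/idle percentages
# from a sar -P ALL capture saved as sar-cpu.txt (hypothetical file name).
grep -E '^[0-9]{2}:[0-9]{2}:[0-9]{2} +all ' sar-cpu.txt \
  | awk '{printf "%s user=%s%% idle=%s%%\n", $1, $3, $NF}'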