11:58:54 Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/113937 11:58:54 Running as SYSTEM 11:58:54 [EnvInject] - Loading node environment variables. 11:58:54 Building remotely on prd-ubuntu2004-docker-4c-16g-43441 (ubuntu2004-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master 11:58:54 [ssh-agent] Looking for ssh-agent implementation... 11:58:54 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 11:58:54 $ ssh-agent 11:58:54 SSH_AUTH_SOCK=/tmp/ssh-xRKVteOnVmud/agent.12552 11:58:54 SSH_AGENT_PID=12556 11:58:54 [ssh-agent] Started. 11:58:54 Running ssh-add (command line suppressed) 11:58:54 Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_37107645823463558.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_37107645823463558.key) 11:58:54 [ssh-agent] Using credentials jenkins (jenkins-ssh) 11:58:54 The recommended git tool is: NONE 11:58:57 using credential jenkins-ssh 11:58:57 Wiping out workspace first. 11:58:57 Cloning the remote Git repository 11:58:57 Cloning repository git://devvexx.opendaylight.org/mirror/transportpce 11:58:57 > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10 11:58:57 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 11:58:57 > git --version # timeout=10 11:58:57 > git --version # 'git version 2.25.1' 11:58:57 using GIT_SSH to set credentials jenkins-ssh 11:58:57 Verifying host key using known hosts file 11:58:57 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 11:58:57 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10 11:59:01 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 11:59:01 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 11:59:03 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 11:59:03 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 11:59:03 using GIT_SSH to set credentials jenkins-ssh 11:59:03 Verifying host key using known hosts file 11:59:03 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
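For reference, the checkout the job performs here can be reproduced by hand against the same mirror; a minimal sketch only, reusing the mirror URL and the Gerrit change 113937 / patchset 10 ref that the job fetches next in this log:

    # minimal local reproduction of the checkout above (illustrative, not part of the job)
    git init transportpce && cd transportpce
    git fetch git://devvexx.opendaylight.org/mirror/transportpce '+refs/heads/*:refs/remotes/origin/*'
    # fetch the change under review (change 113937, patchset 10, as in the next fetch below)
    git fetch git://devvexx.opendaylight.org/mirror/transportpce refs/changes/37/113937/10
    git checkout -f FETCH_HEAD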
11:59:03 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/37/113937/10 # timeout=10 11:59:04 > git rev-parse 119bf6c02446b8ddfa89ef539a4941bb517be426^{commit} # timeout=10 11:59:04 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script 11:59:04 Checking out Revision 119bf6c02446b8ddfa89ef539a4941bb517be426 (refs/changes/37/113937/10) 11:59:04 > git config core.sparsecheckout # timeout=10 11:59:04 > git checkout -f 119bf6c02446b8ddfa89ef539a4941bb517be426 # timeout=10 11:59:05 Commit message: "Add Func Test for Topology extension" 11:59:05 > git rev-parse FETCH_HEAD^{commit} # timeout=10 11:59:05 > git rev-list --no-walk 053f6334b51c8ce054fdb782bd46136161a47ce1 # timeout=10 11:59:05 > git remote # timeout=10 11:59:05 > git submodule init # timeout=10 11:59:05 > git submodule sync # timeout=10 11:59:05 > git config --get remote.origin.url # timeout=10 11:59:05 > git submodule init # timeout=10 11:59:05 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10 11:59:05 ERROR: No submodules found. 11:59:08 provisioning config files... 11:59:08 copy managed file [npmrc] to file:/home/jenkins/.npmrc 11:59:08 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 11:59:08 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins5416026192293311717.sh 11:59:08 ---> python-tools-install.sh 11:59:08 Setup pyenv: 11:59:08 * system (set by /opt/pyenv/version) 11:59:08 * 3.8.13 (set by /opt/pyenv/version) 11:59:08 * 3.9.13 (set by /opt/pyenv/version) 11:59:08 * 3.10.13 (set by /opt/pyenv/version) 11:59:08 * 3.11.7 (set by /opt/pyenv/version) 11:59:13 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-L3Bd 11:59:13 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 11:59:16 lf-activate-venv(): INFO: Installing: lftools 11:59:53 lf-activate-venv(): INFO: Adding /tmp/venv-L3Bd/bin to PATH 11:59:53 Generating Requirements File 12:00:13 Python 3.11.7 12:00:13 pip 24.2 from /tmp/venv-L3Bd/lib/python3.11/site-packages/pip (python 3.11) 12:00:13 appdirs==1.4.4 12:00:13 argcomplete==3.5.1 12:00:13 aspy.yaml==1.3.0 12:00:13 attrs==24.2.0 12:00:13 autopage==0.5.2 12:00:13 beautifulsoup4==4.12.3 12:00:13 boto3==1.35.42 12:00:13 botocore==1.35.42 12:00:13 bs4==0.0.2 12:00:13 cachetools==5.5.0 12:00:13 certifi==2024.8.30 12:00:13 cffi==1.17.1 12:00:13 cfgv==3.4.0 12:00:13 chardet==5.2.0 12:00:13 charset-normalizer==3.4.0 12:00:13 click==8.1.7 12:00:13 cliff==4.7.0 12:00:13 cmd2==2.4.3 12:00:13 cryptography==3.3.2 12:00:13 debtcollector==3.0.0 12:00:13 decorator==5.1.1 12:00:13 defusedxml==0.7.1 12:00:13 Deprecated==1.2.14 12:00:13 distlib==0.3.9 12:00:13 dnspython==2.7.0 12:00:13 docker==4.2.2 12:00:13 dogpile.cache==1.3.3 12:00:13 durationpy==0.9 12:00:13 email_validator==2.2.0 12:00:13 filelock==3.16.1 12:00:13 future==1.0.0 12:00:13 gitdb==4.0.11 12:00:13 GitPython==3.1.43 12:00:13 google-auth==2.35.0 12:00:13 httplib2==0.22.0 12:00:13 identify==2.6.1 12:00:13 idna==3.10 12:00:13 importlib-resources==1.5.0 12:00:13 iso8601==2.1.0 12:00:13 Jinja2==3.1.4 12:00:13 jmespath==1.0.1 12:00:13 jsonpatch==1.33 12:00:13 jsonpointer==3.0.0 12:00:13 jsonschema==4.23.0 12:00:13 jsonschema-specifications==2024.10.1 12:00:13 keystoneauth1==5.8.0 12:00:13 kubernetes==31.0.0 12:00:13 lftools==0.37.10 12:00:13 lxml==5.3.0 12:00:13 MarkupSafe==3.0.1 12:00:13 msgpack==1.1.0 12:00:13 multi_key_dict==2.0.3 
12:00:13 munch==4.0.0 12:00:13 netaddr==1.3.0 12:00:13 netifaces==0.11.0 12:00:13 niet==1.4.2 12:00:13 nodeenv==1.9.1 12:00:13 oauth2client==4.1.3 12:00:13 oauthlib==3.2.2 12:00:13 openstacksdk==4.0.0 12:00:13 os-client-config==2.1.0 12:00:13 os-service-types==1.7.0 12:00:13 osc-lib==3.1.0 12:00:13 oslo.config==9.6.0 12:00:13 oslo.context==5.6.0 12:00:13 oslo.i18n==6.4.0 12:00:13 oslo.log==6.1.2 12:00:13 oslo.serialization==5.5.0 12:00:13 oslo.utils==7.3.0 12:00:13 packaging==24.1 12:00:13 pbr==6.1.0 12:00:13 platformdirs==4.3.6 12:00:13 prettytable==3.11.0 12:00:13 pyasn1==0.6.1 12:00:13 pyasn1_modules==0.4.1 12:00:13 pycparser==2.22 12:00:13 pygerrit2==2.0.15 12:00:13 PyGithub==2.4.0 12:00:13 PyJWT==2.9.0 12:00:13 PyNaCl==1.5.0 12:00:13 pyparsing==2.4.7 12:00:13 pyperclip==1.9.0 12:00:13 pyrsistent==0.20.0 12:00:13 python-cinderclient==9.6.0 12:00:13 python-dateutil==2.9.0.post0 12:00:13 python-heatclient==4.0.0 12:00:13 python-jenkins==1.8.2 12:00:13 python-keystoneclient==5.5.0 12:00:13 python-magnumclient==4.7.0 12:00:13 python-openstackclient==7.1.3 12:00:13 python-swiftclient==4.6.0 12:00:13 PyYAML==6.0.2 12:00:13 referencing==0.35.1 12:00:13 requests==2.32.3 12:00:13 requests-oauthlib==2.0.0 12:00:13 requestsexceptions==1.4.0 12:00:13 rfc3986==2.0.0 12:00:13 rpds-py==0.20.0 12:00:13 rsa==4.9 12:00:13 ruamel.yaml==0.18.6 12:00:13 ruamel.yaml.clib==0.2.8 12:00:13 s3transfer==0.10.3 12:00:13 simplejson==3.19.3 12:00:13 six==1.16.0 12:00:13 smmap==5.0.1 12:00:13 soupsieve==2.6 12:00:13 stevedore==5.3.0 12:00:13 tabulate==0.9.0 12:00:13 toml==0.10.2 12:00:13 tomlkit==0.13.2 12:00:13 tqdm==4.66.5 12:00:13 typing_extensions==4.12.2 12:00:13 tzdata==2024.2 12:00:13 urllib3==1.26.20 12:00:13 virtualenv==20.26.6 12:00:13 wcwidth==0.2.13 12:00:13 websocket-client==1.8.0 12:00:13 wrapt==1.16.0 12:00:13 xdg==6.0.0 12:00:13 xmltodict==0.14.2 12:00:13 yq==3.4.3 12:00:13 [EnvInject] - Injecting environment variables from a build step. 12:00:13 [EnvInject] - Injecting as environment variables the properties content 12:00:13 PYTHON=python3 12:00:13 12:00:13 [EnvInject] - Variables injected successfully. 
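The python-tools-install step above boils down to creating a temporary venv, installing lftools into it, and freezing the result into the package listing shown; a simplified sketch of that pattern (the venv path is illustrative, not the exact script):

    # simplified sketch of the python-tools install/freeze pattern logged above (illustrative path)
    python3 -m venv /tmp/venv-example
    /tmp/venv-example/bin/pip install --upgrade pip lftools
    /tmp/venv-example/bin/pip freeze        # yields a package list like the one printed above
    export PATH=/tmp/venv-example/bin:$PATH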
12:00:13 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins14463725777731643214.sh 12:00:13 ---> tox-install.sh 12:00:13 + source /home/jenkins/lf-env.sh 12:00:13 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 12:00:13 ++ mktemp -d /tmp/venv-XXXX 12:00:13 + lf_venv=/tmp/venv-TVgW 12:00:13 + local venv_file=/tmp/.os_lf_venv 12:00:13 + local python=python3 12:00:13 + local options 12:00:13 + local set_path=true 12:00:13 + local install_args= 12:00:13 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 12:00:13 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 12:00:13 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 12:00:13 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 12:00:13 + true 12:00:13 + case $1 in 12:00:13 + venv_file=/tmp/.toxenv 12:00:13 + shift 2 12:00:13 + true 12:00:13 + case $1 in 12:00:13 + shift 12:00:13 + break 12:00:13 + case $python in 12:00:13 + local pkg_list= 12:00:13 + [[ -d /opt/pyenv ]] 12:00:13 + echo 'Setup pyenv:' 12:00:13 Setup pyenv: 12:00:13 + export PYENV_ROOT=/opt/pyenv 12:00:13 + PYENV_ROOT=/opt/pyenv 12:00:13 + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:13 + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:13 + pyenv versions 12:00:14 system 12:00:14 3.8.13 12:00:14 3.9.13 12:00:14 3.10.13 12:00:14 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 12:00:14 + command -v pyenv 12:00:14 ++ pyenv init - --no-rehash 12:00:14 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 12:00:14 for i in ${!paths[@]}; do 12:00:14 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 12:00:14 fi; done; 12:00:14 echo "${paths[*]}"'\'')" 12:00:14 export PATH="/opt/pyenv/shims:${PATH}" 12:00:14 export PYENV_SHELL=bash 12:00:14 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 12:00:14 pyenv() { 12:00:14 local command 12:00:14 command="${1:-}" 12:00:14 if [ "$#" -gt 0 ]; then 12:00:14 shift 12:00:14 fi 12:00:14 12:00:14 case "$command" in 12:00:14 rehash|shell) 12:00:14 eval "$(pyenv "sh-$command" "$@")" 12:00:14 ;; 12:00:14 *) 12:00:14 command pyenv "$command" "$@" 12:00:14 ;; 12:00:14 esac 12:00:14 }' 12:00:14 +++ bash --norc -ec 'IFS=:; paths=($PATH); 12:00:14 for i in ${!paths[@]}; do 12:00:14 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 12:00:14 fi; done; 12:00:14 echo "${paths[*]}"' 12:00:14 ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:14 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:14 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:14 ++ export PYENV_SHELL=bash 12:00:14 ++ PYENV_SHELL=bash 12:00:14 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 12:00:14 +++ complete -F _pyenv pyenv 12:00:14 ++ 
lf-pyver python3 12:00:14 ++ local py_version_xy=python3 12:00:14 ++ local py_version_xyz= 12:00:14 ++ pyenv versions 12:00:14 ++ local command 12:00:14 ++ command=versions 12:00:14 ++ '[' 1 -gt 0 ']' 12:00:14 ++ shift 12:00:14 ++ case "$command" in 12:00:14 ++ command pyenv versions 12:00:14 ++ pyenv versions 12:00:14 ++ sed 's/^[ *]* //' 12:00:14 ++ grep -E '^[0-9.]*[0-9]$' 12:00:14 ++ awk '{ print $1 }' 12:00:14 ++ [[ ! -s /tmp/.pyenv_versions ]] 12:00:14 +++ grep '^3' /tmp/.pyenv_versions 12:00:14 +++ sort -V 12:00:14 +++ tail -n 1 12:00:14 ++ py_version_xyz=3.11.7 12:00:14 ++ [[ -z 3.11.7 ]] 12:00:14 ++ echo 3.11.7 12:00:14 ++ return 0 12:00:14 + pyenv local 3.11.7 12:00:14 + local command 12:00:14 + command=local 12:00:14 + '[' 2 -gt 0 ']' 12:00:14 + shift 12:00:14 + case "$command" in 12:00:14 + command pyenv local 3.11.7 12:00:14 + pyenv local 3.11.7 12:00:14 + for arg in "$@" 12:00:14 + case $arg in 12:00:14 + pkg_list+='tox ' 12:00:14 + for arg in "$@" 12:00:14 + case $arg in 12:00:14 + pkg_list+='virtualenv ' 12:00:14 + for arg in "$@" 12:00:14 + case $arg in 12:00:14 + pkg_list+='urllib3~=1.26.15 ' 12:00:14 + [[ -f /tmp/.toxenv ]] 12:00:14 + [[ ! -f /tmp/.toxenv ]] 12:00:14 + [[ -n '' ]] 12:00:14 + python3 -m venv /tmp/venv-TVgW 12:00:18 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-TVgW' 12:00:18 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-TVgW 12:00:18 + echo /tmp/venv-TVgW 12:00:18 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' 12:00:18 lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv 12:00:18 + /tmp/venv-TVgW/bin/python3 -m pip install --upgrade --quiet pip virtualenv 12:00:20 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 12:00:20 + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' 12:00:20 lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 12:00:20 + /tmp/venv-TVgW/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 12:00:23 + type python3 12:00:23 + true 12:00:23 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-TVgW/bin to PATH' 12:00:23 lf-activate-venv(): INFO: Adding /tmp/venv-TVgW/bin to PATH 12:00:23 + PATH=/tmp/venv-TVgW/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:23 + return 0 12:00:23 + python3 --version 12:00:23 Python 3.11.7 12:00:23 + python3 -m pip --version 12:00:23 pip 24.2 from /tmp/venv-TVgW/lib/python3.11/site-packages/pip (python 3.11) 12:00:23 + python3 -m pip freeze 12:00:24 cachetools==5.5.0 12:00:24 chardet==5.2.0 12:00:24 colorama==0.4.6 12:00:24 distlib==0.3.9 12:00:24 filelock==3.16.1 12:00:24 packaging==24.1 12:00:24 platformdirs==4.3.6 12:00:24 pluggy==1.5.0 12:00:24 pyproject-api==1.8.0 12:00:24 tox==4.23.0 12:00:24 urllib3==1.26.20 12:00:24 virtualenv==20.26.6 12:00:24 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins8562887227443093445.sh 12:00:24 [EnvInject] - Injecting environment variables from a build step. 12:00:24 [EnvInject] - Injecting as environment variables the properties content 12:00:24 PARALLEL=True 12:00:24 12:00:24 [EnvInject] - Variables injected successfully. 
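The lf-activate-venv trace above records the venv path in /tmp/.toxenv so the later tox-run step can reuse it; a simplified sketch of that caching pattern, not the actual lf-env.sh code:

    # simplified sketch of the venv-file caching seen in the lf-activate-venv trace (not the real lf-env.sh)
    venv_file=/tmp/.toxenv
    if [ -f "$venv_file" ]; then
        lf_venv=$(cat "$venv_file")            # reuse the venv recorded by an earlier step
    else
        lf_venv=$(mktemp -d /tmp/venv-XXXX)    # create a fresh venv and remember its path
        python3 -m venv "$lf_venv"
        echo "$lf_venv" > "$venv_file"
    fi
    "$lf_venv/bin/python3" -m pip install --upgrade --quiet tox virtualenv 'urllib3~=1.26.15'
    export PATH="$lf_venv/bin:$PATH"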
12:00:24 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins12842756929581580101.sh 12:00:24 ---> tox-run.sh 12:00:24 + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox 12:00:24 + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs 12:00:24 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox 12:00:24 + cd /w/workspace/transportpce-tox-verify-transportpce-master/. 12:00:24 + source /home/jenkins/lf-env.sh 12:00:24 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 12:00:24 ++ mktemp -d /tmp/venv-XXXX 12:00:24 + lf_venv=/tmp/venv-cvIe 12:00:24 + local venv_file=/tmp/.os_lf_venv 12:00:24 + local python=python3 12:00:24 + local options 12:00:24 + local set_path=true 12:00:24 + local install_args= 12:00:24 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 12:00:24 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 12:00:24 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 12:00:24 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 12:00:24 + true 12:00:24 + case $1 in 12:00:24 + venv_file=/tmp/.toxenv 12:00:24 + shift 2 12:00:24 + true 12:00:24 + case $1 in 12:00:24 + shift 12:00:24 + break 12:00:24 + case $python in 12:00:24 + local pkg_list= 12:00:24 + [[ -d /opt/pyenv ]] 12:00:24 + echo 'Setup pyenv:' 12:00:24 Setup pyenv: 12:00:24 + export PYENV_ROOT=/opt/pyenv 12:00:24 + PYENV_ROOT=/opt/pyenv 12:00:24 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 + pyenv versions 12:00:24 system 12:00:24 3.8.13 12:00:24 3.9.13 12:00:24 3.10.13 12:00:24 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 12:00:24 + command -v pyenv 12:00:24 ++ pyenv init - --no-rehash 12:00:24 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 12:00:24 for i in ${!paths[@]}; do 12:00:24 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 12:00:24 fi; done; 12:00:24 echo "${paths[*]}"'\'')" 12:00:24 export PATH="/opt/pyenv/shims:${PATH}" 12:00:24 export PYENV_SHELL=bash 12:00:24 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 12:00:24 pyenv() { 12:00:24 local command 12:00:24 command="${1:-}" 12:00:24 if [ "$#" -gt 0 ]; then 12:00:24 shift 12:00:24 fi 12:00:24 12:00:24 case "$command" in 12:00:24 rehash|shell) 12:00:24 eval "$(pyenv "sh-$command" "$@")" 12:00:24 ;; 12:00:24 *) 12:00:24 command pyenv "$command" "$@" 12:00:24 ;; 12:00:24 esac 12:00:24 }' 12:00:24 +++ bash --norc -ec 'IFS=:; paths=($PATH); 12:00:24 for i in ${!paths[@]}; do 12:00:24 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 12:00:24 fi; done; 12:00:24 echo "${paths[*]}"' 12:00:24 ++ 
PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:24 ++ export PYENV_SHELL=bash 12:00:24 ++ PYENV_SHELL=bash 12:00:24 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 12:00:24 +++ complete -F _pyenv pyenv 12:00:24 ++ lf-pyver python3 12:00:24 ++ local py_version_xy=python3 12:00:24 ++ local py_version_xyz= 12:00:24 ++ pyenv versions 12:00:24 ++ local command 12:00:24 ++ command=versions 12:00:24 ++ '[' 1 -gt 0 ']' 12:00:24 ++ shift 12:00:24 ++ case "$command" in 12:00:24 ++ command pyenv versions 12:00:24 ++ pyenv versions 12:00:24 ++ awk '{ print $1 }' 12:00:24 ++ grep -E '^[0-9.]*[0-9]$' 12:00:24 ++ sed 's/^[ *]* //' 12:00:24 ++ [[ ! -s /tmp/.pyenv_versions ]] 12:00:24 +++ grep '^3' /tmp/.pyenv_versions 12:00:24 +++ sort -V 12:00:24 +++ tail -n 1 12:00:24 ++ py_version_xyz=3.11.7 12:00:24 ++ [[ -z 3.11.7 ]] 12:00:24 ++ echo 3.11.7 12:00:24 ++ return 0 12:00:24 + pyenv local 3.11.7 12:00:24 + local command 12:00:24 + command=local 12:00:24 + '[' 2 -gt 0 ']' 12:00:24 + shift 12:00:24 + case "$command" in 12:00:24 + command pyenv local 3.11.7 12:00:24 + pyenv local 3.11.7 12:00:24 + for arg in "$@" 12:00:24 + case $arg in 12:00:24 + pkg_list+='tox ' 12:00:24 + for arg in "$@" 12:00:24 + case $arg in 12:00:24 + pkg_list+='virtualenv ' 12:00:24 + for arg in "$@" 12:00:24 + case $arg in 12:00:24 + pkg_list+='urllib3~=1.26.15 ' 12:00:24 + [[ -f /tmp/.toxenv ]] 12:00:24 ++ cat /tmp/.toxenv 12:00:24 + lf_venv=/tmp/venv-TVgW 12:00:24 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-TVgW from' file:/tmp/.toxenv 12:00:24 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-TVgW from file:/tmp/.toxenv 12:00:24 + /tmp/venv-TVgW/bin/python3 -m pip install --upgrade --quiet pip virtualenv 12:00:25 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 12:00:25 + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' 12:00:25 lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 12:00:25 + /tmp/venv-TVgW/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 12:00:26 + type python3 12:00:26 + true 12:00:26 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-TVgW/bin to PATH' 12:00:26 lf-activate-venv(): INFO: Adding /tmp/venv-TVgW/bin to PATH 12:00:26 + PATH=/tmp/venv-TVgW/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:26 + return 0 12:00:26 + [[ -d /opt/pyenv ]] 12:00:26 + echo '---> Setting up pyenv' 12:00:26 ---> Setting up pyenv 12:00:26 + export PYENV_ROOT=/opt/pyenv 12:00:26 + PYENV_ROOT=/opt/pyenv 12:00:26 + export PATH=/opt/pyenv/bin:/tmp/venv-TVgW/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:26 + 
PATH=/opt/pyenv/bin:/tmp/venv-TVgW/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 12:00:26 ++ pwd 12:00:26 + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master 12:00:26 + export PYTHONPATH 12:00:26 + export TOX_TESTENV_PASSENV=PYTHONPATH 12:00:26 + TOX_TESTENV_PASSENV=PYTHONPATH 12:00:26 + tox --version 12:00:26 4.23.0 from /tmp/venv-TVgW/lib/python3.11/site-packages/tox/__init__.py 12:00:26 + PARALLEL=True 12:00:26 + TOX_OPTIONS_LIST= 12:00:26 + [[ -n '' ]] 12:00:26 + case ${PARALLEL,,} in 12:00:26 + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' 12:00:26 + tox --parallel auto --parallel-live 12:00:26 + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log 12:00:28 checkbashisms: freeze> python -m pip freeze --all 12:00:28 buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:00:28 docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt 12:00:28 docs: install_deps> python -I -m pip install -r docs/requirements.txt 12:00:28 checkbashisms: pip==24.2,setuptools==75.1.0,wheel==0.44.0 12:00:28 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh 12:00:28 checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' 12:00:28 checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + 12:00:29 script ./reflectwarn.sh does not appear to have a #! interpreter line; 12:00:29 you may get strange results 12:00:30 checkbashisms: OK ✔ in 2.93 seconds 12:00:30 pre-commit: install_deps> python -I -m pip install pre-commit 12:00:32 pre-commit: freeze> python -m pip freeze --all 12:00:33 pre-commit: cfgv==3.4.0,distlib==0.3.9,filelock==3.16.1,identify==2.6.1,nodeenv==1.9.1,pip==24.2,platformdirs==4.3.6,pre_commit==4.0.1,PyYAML==6.0.2,setuptools==75.1.0,virtualenv==20.26.6,wheel==0.44.0 12:00:33 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh 12:00:33 pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' 12:00:33 /usr/bin/cpan 12:00:33 pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure 12:00:33 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 
12:00:33 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 12:00:33 [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. 12:00:33 [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo. 12:00:33 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. 12:00:34 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. 12:00:34 buildcontroller: freeze> python -m pip freeze --all 12:00:34 buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:00:34 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh 12:00:34 + update-java-alternatives -l 12:00:34 java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 12:00:34 java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64 12:00:34 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 12:00:34 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 12:00:34 java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64 12:00:34 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 12:00:34 [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. 12:00:34 + java -version 12:00:34 + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; 12:00:34 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. 12:00:35 [INFO] Initializing environment for https://github.com/perltidy/perltidy. 12:00:35 + JAVA_VER=21 12:00:35 + echo 21 12:00:35 21 12:00:35 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; 12:00:35 + javac -version 12:00:35 + JAVAC_VER=21 12:00:35 + echo 21 12:00:35 21 12:00:35 ok, java is 21 or newer 12:00:35 + [ 21 -ge 21 ] 12:00:35 + [ 21 -ge 21 ] 12:00:35 + echo ok, java is 21 or newer 12:00:35 + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp 12:00:36 [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. 12:00:36 [INFO] Once installed this environment will be reused. 12:00:36 [INFO] This may take a few minutes... 
12:00:36 2024-10-17 12:00:36 URL:https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz [9083702/9083702] -> "/tmp/apache-maven-3.9.8-bin.tar.gz" [1] 12:00:36 + sudo mkdir -p /opt 12:00:36 + sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt 12:00:36 + sudo ln -s /opt/apache-maven-3.9.8 /opt/maven 12:00:36 + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn 12:00:36 + mvn --version 12:00:36 Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256) 12:00:36 Maven home: /opt/maven 12:00:36 Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 12:00:36 Default locale: en, platform encoding: UTF-8 12:00:36 OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix" 12:00:36 NOTE: Picked up JDK_JAVA_OPTIONS: 12:00:36 --add-opens=java.base/java.io=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.lang=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.net=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.nio=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.nio.file=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.util=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.util.jar=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.util.stream=ALL-UNNAMED 12:00:36 --add-opens=java.base/java.util.zip=ALL-UNNAMED 12:00:36 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 12:00:36 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 12:00:36 -Xlog:disable 12:00:40 [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. 12:00:40 [INFO] Once installed this environment will be reused. 12:00:40 [INFO] This may take a few minutes... 12:00:50 [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. 12:00:50 [INFO] Once installed this environment will be reused. 12:00:50 [INFO] This may take a few minutes... 12:00:53 [INFO] Installing environment for https://github.com/perltidy/perltidy. 12:00:53 [INFO] Once installed this environment will be reused. 12:00:53 [INFO] This may take a few minutes... 
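The toolchain setup traced above for build_controller.sh reduces to selecting the JDK 21 alternative, checking the major version, and unpacking Maven 3.9.8 under /opt; a condensed sketch (the real script carries more checks than this):

    # condensed sketch of the JDK/Maven setup traced above (the real build_controller.sh has more checks)
    sudo update-java-alternatives -s java-1.21.0-openjdk-amd64
    JAVA_VER=$(java -version 2>&1 | sed -n 's/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p')
    [ "$JAVA_VER" -ge 21 ] && echo "ok, java is 21 or newer"
    wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp
    sudo mkdir -p /opt
    sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt
    sudo ln -s /opt/apache-maven-3.9.8 /opt/maven
    sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn
    mvn --version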
12:00:56 docs-linkcheck: freeze> python -m pip freeze --all 12:00:57 docs: freeze> python -m pip freeze --all 12:00:57 docs-linkcheck: alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.1,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.2,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 12:00:57 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck 12:00:57 docs: alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.1,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.2,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 12:00:57 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html 12:01:00 docs: OK ✔ in 33.02 seconds 12:01:00 pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' 12:01:03 docs-linkcheck: OK ✔ in 33.8 seconds 12:01:03 pylint: freeze> python -m pip freeze --all 12:01:03 pylint: astroid==3.3.5,dill==0.3.9,isort==5.13.2,mccabe==0.7.0,pip==24.2,platformdirs==4.3.6,pylint==3.3.1,setuptools==75.1.0,tomlkit==0.13.2,wheel==0.44.0 12:01:03 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find 
transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + 12:01:06 trim trailing whitespace.................................................Passed 12:01:06 Tabs remover.............................................................Passed 12:01:06 autopep8.................................................................Passed 12:01:11 perltidy.................................................................Passed 12:01:12 pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual 12:01:14 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 12:01:14 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 12:01:14 [INFO] Installing environment for https://github.com/jorisroovers/gitlint. 12:01:14 [INFO] Once installed this environment will be reused. 12:01:14 [INFO] This may take a few minutes... 12:01:18 gitlint..................................................................Passed 12:01:25 ************* Module 1.2.1.test03_topology 12:01:25 transportpce_tests/1.2.1/test03_topology.py:168:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:202:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:224:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:364:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:441:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:553:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:646:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:719:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/1.2.1/test03_topology.py:430:4: R0912: Too many branches (13/12) (too-many-branches) 12:01:25 ************* Module 1.2.1.test02_topo_portmapping 12:01:25 transportpce_tests/1.2.1/test02_topo_portmapping.py:59:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 ************* Module tapi.test03_tapi_device_change_notifications 12:01:25 transportpce_tests/tapi/test03_tapi_device_change_notifications.py:334:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/tapi/test03_tapi_device_change_notifications.py:467:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/tapi/test03_tapi_device_change_notifications.py:562:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/tapi/test03_tapi_device_change_notifications.py:728:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 
transportpce_tests/tapi/test03_tapi_device_change_notifications.py:874:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/tapi/test03_tapi_device_change_notifications.py:14:0: C0302: Too many lines in module (1003/1000) (too-many-lines) 12:01:25 ************* Module hybrid.test01_device_change_notifications 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:238:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:304:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:359:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:447:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:533:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/hybrid/test01_device_change_notifications.py:619:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 ************* Module 2.2.1.test03_topology 12:01:25 transportpce_tests/2.2.1/test03_topology.py:171:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:205:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:227:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:370:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:449:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:566:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:660:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test03_topology.py:737:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 ************* Module 2.2.1.test04_otn_topology 12:01:25 transportpce_tests/2.2.1/test04_otn_topology.py:89:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test04_otn_topology.py:153:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 transportpce_tests/2.2.1/test04_otn_topology.py:120:4: R0912: Too many branches (13/12) (too-many-branches) 12:01:25 ************* Module 2.2.1.test02_topo_portmapping 12:01:25 transportpce_tests/2.2.1/test02_topo_portmapping.py:62:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 12:01:25 12:01:25 ----------------------------------- 12:01:25 Your code has been rated at 9.97/10 12:01:25 12:01:27 pre-commit: OK ✔ in 48.92 seconds 12:01:27 pylint: exit 1 (23.55 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + pid=29928 12:02:19 pylint: 
FAIL ✖ in 27.15 seconds 12:02:19 buildcontroller: OK ✔ in 1 minute 51.93 seconds 12:02:19 build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:02:19 build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:02:19 sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:02:19 testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:02:25 build_karaf_tests221: freeze> python -m pip freeze --all 12:02:25 sims: freeze> python -m pip freeze --all 12:02:25 build_karaf_tests121: freeze> python -m pip freeze --all 12:02:25 build_karaf_tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:02:25 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 12:02:25 NOTE: Picked up JDK_JAVA_OPTIONS: 12:02:25 --add-opens=java.base/java.io=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.net=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio.file=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.jar=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.stream=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.zip=ALL-UNNAMED 12:02:25 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 12:02:25 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 12:02:25 -Xlog:disable 12:02:25 sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:02:25 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh 12:02:25 build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:02:25 build_karaf_tests121: commands[0] 
/w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 12:02:25 Using lighynode version 20.1.0.2 12:02:25 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory 12:02:25 NOTE: Picked up JDK_JAVA_OPTIONS: 12:02:25 --add-opens=java.base/java.io=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.net=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.nio.file=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.jar=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.stream=ALL-UNNAMED 12:02:25 --add-opens=java.base/java.util.zip=ALL-UNNAMED 12:02:25 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 12:02:25 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 12:02:25 -Xlog:disable 12:02:30 sims: OK ✔ in 11.14 seconds 12:02:30 build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:02:43 build_karaf_tests71: freeze> python -m pip freeze --all 12:02:44 build_karaf_tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:02:44 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 12:02:44 NOTE: Picked up JDK_JAVA_OPTIONS: 12:02:44 --add-opens=java.base/java.io=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.lang=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.net=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.nio=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.nio.file=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.util=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.util.jar=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.util.stream=ALL-UNNAMED 12:02:44 --add-opens=java.base/java.util.zip=ALL-UNNAMED 12:02:44 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 12:02:44 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 12:02:44 -Xlog:disable 12:03:09 build_karaf_tests221: OK ✔ in 50.34 seconds 12:03:09 build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:03:12 build_karaf_tests121: OK ✔ in 53.71 seconds 12:03:12 tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:03:18 build_karaf_tests_hybrid: freeze> python -m pip freeze --all 12:03:18 tests_tapi: freeze> python -m pip freeze --all 12:03:18 
build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:03:18 build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 12:03:18 NOTE: Picked up JDK_JAVA_OPTIONS: 12:03:18 --add-opens=java.base/java.io=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.lang=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.net=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.nio=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.nio.file=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.util=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.util.jar=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.util.stream=ALL-UNNAMED 12:03:18 --add-opens=java.base/java.util.zip=ALL-UNNAMED 12:03:18 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 12:03:18 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 12:03:18 -Xlog:disable 12:03:18 tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:03:18 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi 12:03:18 using environment variables from ./karaf221.env 12:03:18 pytest -q transportpce_tests/tapi/test01_abstracted_topology.py 12:04:02 build_karaf_tests71: OK ✔ in 1 minute 10.04 seconds 12:04:02 testsPCE: freeze> python -m pip freeze --all 12:04:02 testsPCE: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.54.1,gnpy4tpce==2.4.7,idna==3.10,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==3.0.1,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.5.0,pbr==5.11.1,pillow==11.0.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.2.0,pytest==8.3.3,python-dateutil==2.9.0.post0,pytz==2024.2,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.3,Werkzeug==2.0.3,wheel==0.44.0,xlrd==1.2.0 12:04:02 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce 12:04:02 pytest -q transportpce_tests/pce/test01_pce.py 12:04:51 .................................................. [100%] 12:05:54 20 passed in 111.90s (0:01:51) 12:05:55 pytest -q transportpce_tests/pce/test02_pce_400G.py 12:06:13 ............. [100%] 12:06:35 9 passed in 40.36s 12:06:35 pytest -q transportpce_tests/pce/test03_gnpy.py 12:06:37 ............ [100%] 12:07:12 8 passed in 37.05s 12:07:13 pytest -q transportpce_tests/pce/test04_pce_bug_fix.py 12:07:19 ............ 
[100%] 12:07:24 50 passed in 245.38s (0:04:05) 12:07:24 pytest -q transportpce_tests/tapi/test02_full_topology.py 12:07:51 ... [100%] 12:07:56 3 passed in 43.67s 12:07:57 build_karaf_tests_hybrid: OK ✔ in 1 minute 6.33 seconds 12:07:57 testsPCE: OK ✔ in 5 minutes 38.09 seconds 12:07:57 tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:08:03 tests121: freeze> python -m pip freeze --all 12:08:03 tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:08:03 tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 12:08:03 using environment variables from ./karaf121.env 12:08:03 pytest -q transportpce_tests/1.2.1/test01_portmapping.py 12:08:50 ...........FF.........................FF [100%] 12:15:17 =================================== FAILURES =================================== 12:15:17 _____________ TransportPCEtesting.test_12_check_openroadm_topology _____________ 12:15:17 12:15:17 self = 12:15:17 12:15:17 def test_12_check_openroadm_topology(self): 12:15:17 response = test_utils.get_ietf_network_request('openroadm-topology', 'config') 12:15:17 self.assertEqual(response['status_code'], requests.codes.ok) 12:15:17 > self.assertEqual(len(response['network'][0]['node']), 14, 'There should be 14 openroadm nodes') 12:15:17 E AssertionError: 18 != 14 : There should be 14 openroadm nodes 12:15:17 12:15:17 transportpce_tests/tapi/test02_full_topology.py:272: AssertionError 12:15:17 ____________ TransportPCEtesting.test_13_get_tapi_topology_details _____________ 12:15:17 12:15:17 self = 12:15:17 12:15:17 def test_13_get_tapi_topology_details(self): 12:15:17 self.tapi_topo["topology-id"] = test_utils.T0_FULL_MULTILAYER_TOPO_UUID 12:15:17 response = test_utils.transportpce_api_rpc_request( 12:15:17 'tapi-topology', 'get-topology-details', self.tapi_topo) 12:15:17 time.sleep(2) 12:15:17 self.assertEqual(response['status_code'], requests.codes.ok) 12:15:17 > self.assertEqual(len(response['output']['topology']['node']), 8, 'There should be 8 TAPI nodes') 12:15:17 E AssertionError: 9 != 8 : There should be 8 TAPI nodes 12:15:17 12:15:17 transportpce_tests/tapi/test02_full_topology.py:282: AssertionError 12:15:17 =========================== short test summary info ============================ 12:15:17 FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_12_check_openroadm_topology 12:15:17 FAILED transportpce_tests/tapi/test02_full_topology.py::TransportPCEtesting::test_13_get_tapi_topology_details 12:15:17 2 failed, 28 passed in 473.25s (0:07:53) 12:15:17 tests_tapi: exit 1 (719.17 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi pid=30668 12:15:17 tests_tapi: FAIL ✖ in 12 minutes 5.03 seconds 12:15:17 tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:15:18 FFFFtests71: freeze> python -m pip freeze --all 
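The two tapi failures reported above (test_12 and test_13 in test02_full_topology.py) can be replayed on their own instead of rerunning the whole tox environment; an illustrative invocation, assuming the prerequisites that the tests_tapi env provides (built karaf, installed lightynode simulator, environment variables from ./karaf221.env) are already in place:

    # illustrative re-run of only the failing tapi cases (assumes the environment
    # normally prepared for ./launch_tests.sh tapi, including ./karaf221.env, is in place)
    cd tests
    pytest -q transportpce_tests/tapi/test02_full_topology.py -k 'test_12 or test_13'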
12:15:23 Ftests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:15:23 tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1 12:15:23 using environment variables from ./karaf71.env 12:15:23 pytest -q transportpce_tests/7.1/test01_portmapping.py 12:15:24 FFFFFF [100%] 12:15:32 =================================== FAILURES =================================== 12:15:32 _________ TransportPCEPortMappingTesting.test_09_xpdr_portmapping_info _________ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 
12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 
12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 
12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 
12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 
12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 12:15:32 12:15:32 :return: A new ``Retry`` object. 
12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_09_xpdr_portmapping_info(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:109: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 
../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_09_xpdr_portmapping_info 12:15:32 _______ TransportPCEPortMappingTesting.test_10_xpdr_portmapping_NETWORK1 _______ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. 
If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 
12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). 
This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 
12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 12:15:32 12:15:32 :return: A new ``Retry`` object. 12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 
12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_10_xpdr_portmapping_NETWORK1(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK1") 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:122: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 
12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 
12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_10_xpdr_portmapping_NETWORK1 12:15:32 _______ TransportPCEPortMappingTesting.test_11_xpdr_portmapping_NETWORK2 _______ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 
12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 
12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. 
Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 
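Editor's note: the urlopen body above shows the pool checking out a connection before handing the request to _make_request. A sketch that drives the same pool API directly, with retries disabled so the underlying urllib3 error surfaces instead of being wrapped by requests; host, port and path are taken from the log, nothing here is TransportPCE code.

# Exercise urllib3's connection pool directly, bypassing requests.
from urllib3 import HTTPConnectionPool
from urllib3.exceptions import HTTPError

pool = HTTPConnectionPool("localhost", port=8182, timeout=10.0)
path = ("/rests/data/transportpce-portmapping:network"
        "/nodes=XPDRA01/mapping=XPDR1-NETWORK2")

try:
    # retries=False makes Retry re-raise the original error (see the
    # "Disabled, indicate to re-raise the error" branch in increment below)
    # instead of wrapping it in a MaxRetryError.
    resp = pool.request("GET", path, retries=False)
    print(resp.status)
except HTTPError as exc:
    print("urllib3 error:", type(exc).__name__, "-", exc)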
12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 
12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 
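Editor's note: the Retry(total=0, connect=None, read=False, redirect=None, status=None) object captured above is what the stock requests HTTPAdapter passes as self.max_retries, so the very first refused connection exhausts the budget and becomes a MaxRetryError. A sketch of mounting an adapter with an explicit Retry that tolerates a few connect failures; the endpoint is the one from the log, the retry numbers are illustrative.

# Give the controller a short grace period instead of failing on the first
# refused connection.  Still raises ConnectionError if it never comes up.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=None, connect=5, read=False, backoff_factor=0.5)
session.mount("http://", HTTPAdapter(max_retries=retry))

try:
    resp = session.get(
        "http://localhost:8182/rests/data/transportpce-portmapping:network"
        "/nodes=XPDRA01/mapping=XPDR1-NETWORK2",
        auth=("admin", "admin"),
        timeout=(10, 10),
    )
    print(resp.status_code)
except requests.exceptions.ConnectionError as exc:
    print("controller still unreachable:", exc)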
12:15:32 12:15:32 :return: A new ``Retry`` object. 12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_11_xpdr_portmapping_NETWORK2(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK2") 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:133: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 
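Editor's note: the test frames above show test_11 calling test_utils.get_portmapping_node_attr, which builds a portmapping URL and delegates to get_request, which calls requests.request. A hypothetical re-creation of that call path, assembled only from the parameters visible in the captured request (URL layout, JSON headers, basic auth, 10 s connect/read timeouts); the function bodies are not the real test_utils code.

# Hypothetical re-creation of the helper chain in the frames above.
import requests

RESTCONF_BASE = "http://localhost:8182/rests/data"
HEADERS = {"Content-Type": "application/json", "Accept": "application/json"}
AUTH = ("admin", "admin")          # Basic YWRtaW46YWRtaW4= in the log
TIMEOUT = (10, 10)                 # Timeout(connect=10, read=10) in the log

def get_request(url: str) -> requests.Response:
    return requests.request("GET", url, headers=HEADERS, auth=AUTH, timeout=TIMEOUT)

def get_portmapping_node_attr(node: str, attr: str, value: str) -> requests.Response:
    target_url = (f"{RESTCONF_BASE}/transportpce-portmapping:network"
                  f"/nodes={node}/{attr}={value}")
    return get_request(target_url)

# response = get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK2")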
transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_11_xpdr_portmapping_NETWORK2 12:15:32 _______ TransportPCEPortMappingTesting.test_12_xpdr_portmapping_CLIENT1 ________ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. 
[duplicate traceback trimmed: the middle of the test_12_xpdr_portmapping_CLIENT1 failure repeats, frame for frame, the urllib3/requests code already shown for test_11 above (create_connection -> ConnectionRefusedError -> urlopen -> NewConnectionError -> Retry.increment -> MaxRetryError -> HTTPAdapter.send), this time for /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1; the log resumes below at the final re-raise in HTTPAdapter.send]
12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_12_xpdr_portmapping_CLIENT1 12:15:32 _______ TransportPCEPortMappingTesting.test_13_xpdr_portmapping_CLIENT2 ________ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 
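Editor's note: every failure from test_10 onward has the same root cause, nothing is accepting connections on localhost:8182, so each portmapping test repeats the identical refused-connection traceback. A hedged sketch of a readiness gate that a test session could run before issuing any RESTCONF request; the port is from the log, the deadline and helper name are invented for the sketch and are not part of the TransportPCE test suite.

# Illustrative readiness gate: poll the RESTCONF port once, up front, so a
# controller that never came up fails one clear check instead of cascading
# through every portmapping test.
import socket
import time

def wait_for_restconf(host: str = "localhost", port: int = 8182,
                      deadline_s: float = 120.0, interval_s: float = 2.0) -> bool:
    end = time.monotonic() + deadline_s
    while time.monotonic() < end:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(interval_s)
    return False

# if not wait_for_restconf():
#     raise RuntimeError("RESTCONF endpoint localhost:8182 never became reachable")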
12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 
12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. 
Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 
12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 
12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 
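# A minimal sketch, assuming urllib3 v2 as installed in this tox env, of the
# retry accounting traced below: with Retry(total=0) the first refused
# connection already exhausts the budget, so increment() raises MaxRetryError
# with the NewConnectionError as its reason instead of attempting a retry.
import urllib3

retry = urllib3.Retry(total=0, connect=None, read=False, redirect=None, status=None)
pool = urllib3.HTTPConnectionPool("localhost", 8182, timeout=10.0, retries=retry)
try:
    # Fails immediately while nothing listens on port 8182, as in this run.
    pool.request("GET", "/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2")
except urllib3.exceptions.MaxRetryError as exc:
    print("retries exhausted on first attempt:", exc.reason)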
12:15:32 12:15:32 :return: A new ``Retry`` object. 12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_13_xpdr_portmapping_CLIENT2(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT2") 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:156: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 
transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_13_xpdr_portmapping_CLIENT2 12:15:32 _______ TransportPCEPortMappingTesting.test_14_xpdr_portmapping_CLIENT3 ________ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. 
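# A minimal sketch, independent of the test harness, that reproduces the failing
# step in isolation: a bare TCP connect to localhost:8182 (the address in every
# traceback of this run) is refused with errno 111 while the controller is down.
import socket

def restconf_port_open(host: str = "localhost", port: int = 8182, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # ConnectionRefusedError (errno 111) and connect timeouts both land here.
        return False

print(restconf_port_open())   # False in this run, hence the NewConnectionError chain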
If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 12:15:32 
""" 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). 
This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 
12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 12:15:32 12:15:32 :return: A new ``Retry`` object. 12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 
12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_14_xpdr_portmapping_CLIENT3(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT3") 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:168: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 
12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 
12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_14_xpdr_portmapping_CLIENT3 12:15:32 _______ TransportPCEPortMappingTesting.test_15_xpdr_portmapping_CLIENT4 ________ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 
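# A minimal sketch of a readiness wait a harness could run before these
# port-mapping requests; wait_for_restconf is an illustrative name, not a helper
# from transportpce_tests/common/test_utils.py. The admin/admin credentials
# match the Basic auth header shown in the requests above.
import time
import requests

def wait_for_restconf(url: str = "http://localhost:8182/rests/data/transportpce-portmapping:network",
                      attempts: int = 30, delay: float = 2.0) -> bool:
    for _ in range(attempts):
        try:
            requests.get(url, auth=("admin", "admin"), timeout=10)
            return True          # any HTTP answer means the port accepts connections
        except requests.exceptions.ConnectionError:
            time.sleep(delay)    # controller not listening yet; try again
    return False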
12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 12:15:32 ) -> BaseHTTPResponse: 12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 
12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. 
Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 
12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:32 conn.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:32 self.endheaders() 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:32 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:32 self.send(msg) 12:15:32 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:32 self.connect() 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:32 self.sock = self._new_conn() 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 except socket.gaierror as e: 12:15:32 raise NameResolutionError(self.host, self, e) from e 12:15:32 except SocketTimeout as e: 12:15:32 raise ConnectTimeoutError( 12:15:32 self, 12:15:32 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:15:32 ) from e 12:15:32 12:15:32 except OSError as e: 12:15:32 > raise NewConnectionError( 12:15:32 self, f"Failed to establish a new connection: {e}" 12:15:32 ) from e 12:15:32 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 
12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 > resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:32 retries = retries.increment( 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 method = 'GET' 12:15:32 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4' 12:15:32 response = None 12:15:32 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:32 _pool = 12:15:32 _stacktrace = 12:15:32 12:15:32 def increment( 12:15:32 self, 12:15:32 method: str | None = None, 12:15:32 url: str | None = None, 12:15:32 response: BaseHTTPResponse | None = None, 12:15:32 error: Exception | None = None, 12:15:32 _pool: ConnectionPool | None = None, 12:15:32 _stacktrace: TracebackType | None = None, 12:15:32 ) -> Self: 12:15:32 """Return a new Retry object with incremented retry counters. 12:15:32 12:15:32 :param response: A response object, or None, if the server did not 12:15:32 return a response. 12:15:32 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:32 :param Exception error: An error encountered during the request, or 12:15:32 None if the response was received successfully. 
12:15:32 12:15:32 :return: A new ``Retry`` object. 12:15:32 """ 12:15:32 if self.total is False and error: 12:15:32 # Disabled, indicate to re-raise the error. 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 12:15:32 total = self.total 12:15:32 if total is not None: 12:15:32 total -= 1 12:15:32 12:15:32 connect = self.connect 12:15:32 read = self.read 12:15:32 redirect = self.redirect 12:15:32 status_count = self.status 12:15:32 other = self.other 12:15:32 cause = "unknown" 12:15:32 status = None 12:15:32 redirect_location = None 12:15:32 12:15:32 if error and self._is_connection_error(error): 12:15:32 # Connect retry? 12:15:32 if connect is False: 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif connect is not None: 12:15:32 connect -= 1 12:15:32 12:15:32 elif error and self._is_read_error(error): 12:15:32 # Read retry? 12:15:32 if read is False or method is None or not self._is_method_retryable(method): 12:15:32 raise reraise(type(error), error, _stacktrace) 12:15:32 elif read is not None: 12:15:32 read -= 1 12:15:32 12:15:32 elif error: 12:15:32 # Other retry? 12:15:32 if other is not None: 12:15:32 other -= 1 12:15:32 12:15:32 elif response and response.get_redirect_location(): 12:15:32 # Redirect retry? 12:15:32 if redirect is not None: 12:15:32 redirect -= 1 12:15:32 cause = "too many redirects" 12:15:32 response_redirect_location = response.get_redirect_location() 12:15:32 if response_redirect_location: 12:15:32 redirect_location = response_redirect_location 12:15:32 status = response.status 12:15:32 12:15:32 else: 12:15:32 # Incrementing because of a server error like a 500 in 12:15:32 # status_forcelist and the given method is in the allowed_methods 12:15:32 cause = ResponseError.GENERIC_ERROR 12:15:32 if response and response.status: 12:15:32 if status_count is not None: 12:15:32 status_count -= 1 12:15:32 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:32 status = response.status 12:15:32 12:15:32 history = self.history + ( 12:15:32 RequestHistory(method, url, error, status, redirect_location), 12:15:32 ) 12:15:32 12:15:32 new_retry = self.new( 12:15:32 total=total, 12:15:32 connect=connect, 12:15:32 read=read, 12:15:32 redirect=redirect, 12:15:32 status=status_count, 12:15:32 other=other, 12:15:32 history=history, 12:15:32 ) 12:15:32 12:15:32 if new_retry.is_exhausted(): 12:15:32 reason = error or ResponseError(cause) 12:15:32 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:32 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:32 12:15:32 During handling of the above exception, another exception occurred: 12:15:32 12:15:32 self = 12:15:32 12:15:32 def test_15_xpdr_portmapping_CLIENT4(self): 12:15:32 > response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT4") 12:15:32 12:15:32 transportpce_tests/1.2.1/test01_portmapping.py:180: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:32 response = get_request(target_url) 12:15:32 
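The Retry state printed above (Retry(total=0, connect=None, read=False, redirect=None, status=None)) explains the failure path: with total=0, the very first "[Errno 111] Connection refused" exhausts the budget inside increment(), so urlopen() raises MaxRetryError instead of retrying, and requests then re-raises it as ConnectionError. That Retry object is the shape of requests' default adapter policy (no retries). A minimal reproduction sketch using urllib3 directly, assuming urllib3 v2.x and nothing listening on localhost:8182, as in this run:

import urllib3
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# Retry(total=0, read=False) mirrors the Retry object shown in the traceback above.
pool = urllib3.HTTPConnectionPool("localhost", 8182, retries=Retry(total=0, read=False))
try:
    pool.urlopen(
        "GET",
        "/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4",
    )
except MaxRetryError as exc:
    # exc.reason is the underlying NewConnectionError ("Connection refused")
    print("gave up after the first connect error:", exc.reason)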
transportpce_tests/common/test_utils.py:116: in get_request 12:15:32 return requests.request( 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:32 return session.request(method=method, url=url, **kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:32 resp = self.send(prep, **send_kwargs) 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:32 r = adapter.send(request, **kwargs) 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 self = 12:15:32 request = , stream = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:32 proxies = OrderedDict() 12:15:32 12:15:32 def send( 12:15:32 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:32 ): 12:15:32 """Sends PreparedRequest object. Returns Response object. 12:15:32 12:15:32 :param request: The :class:`PreparedRequest ` being sent. 12:15:32 :param stream: (optional) Whether to stream the request content. 12:15:32 :param timeout: (optional) How long to wait for the server to send 12:15:32 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:32 read timeout) ` tuple. 12:15:32 :type timeout: float or tuple or urllib3 Timeout object 12:15:32 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:32 we verify the server's TLS certificate, or a string, in which case it 12:15:32 must be a path to a CA bundle to use 12:15:32 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:32 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:32 :rtype: requests.Response 12:15:32 """ 12:15:32 12:15:32 try: 12:15:32 conn = self.get_connection_with_tls_context( 12:15:32 request, verify, proxies=proxies, cert=cert 12:15:32 ) 12:15:32 except LocationValueError as e: 12:15:32 raise InvalidURL(e, request=request) 12:15:32 12:15:32 self.cert_verify(conn, request.url, verify, cert) 12:15:32 url = self.request_url(request, proxies) 12:15:32 self.add_headers( 12:15:32 request, 12:15:32 stream=stream, 12:15:32 timeout=timeout, 12:15:32 verify=verify, 12:15:32 cert=cert, 12:15:32 proxies=proxies, 12:15:32 ) 12:15:32 12:15:32 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:32 12:15:32 if isinstance(timeout, tuple): 12:15:32 try: 12:15:32 connect, read = timeout 12:15:32 timeout = TimeoutSauce(connect=connect, read=read) 12:15:32 except ValueError: 12:15:32 raise ValueError( 12:15:32 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:32 f"or a single float to set both timeouts to the same value." 
12:15:32 ) 12:15:32 elif isinstance(timeout, TimeoutSauce): 12:15:32 pass 12:15:32 else: 12:15:32 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:32 12:15:32 try: 12:15:32 resp = conn.urlopen( 12:15:32 method=request.method, 12:15:32 url=url, 12:15:32 body=request.body, 12:15:32 headers=request.headers, 12:15:32 redirect=False, 12:15:32 assert_same_host=False, 12:15:32 preload_content=False, 12:15:32 decode_content=False, 12:15:32 retries=self.max_retries, 12:15:32 timeout=timeout, 12:15:32 chunked=chunked, 12:15:32 ) 12:15:32 12:15:32 except (ProtocolError, OSError) as err: 12:15:32 raise ConnectionError(err, request=request) 12:15:32 12:15:32 except MaxRetryError as e: 12:15:32 if isinstance(e.reason, ConnectTimeoutError): 12:15:32 # TODO: Remove this in 3.0.0: see #2811 12:15:32 if not isinstance(e.reason, NewConnectionError): 12:15:32 raise ConnectTimeout(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, ResponseError): 12:15:32 raise RetryError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _ProxyError): 12:15:32 raise ProxyError(e, request=request) 12:15:32 12:15:32 if isinstance(e.reason, _SSLError): 12:15:32 # This branch is for urllib3 v1.22 and later. 12:15:32 raise SSLError(e, request=request) 12:15:32 12:15:32 > raise ConnectionError(e, request=request) 12:15:32 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:32 ----------------------------- Captured stdout call ----------------------------- 12:15:32 execution of test_15_xpdr_portmapping_CLIENT4 12:15:32 _______ TransportPCEPortMappingTesting.test_16_xpdr_device_disconnection _______ 12:15:32 12:15:32 self = 12:15:32 12:15:32 def _new_conn(self) -> socket.socket: 12:15:32 """Establish a socket connection and set nodelay settings on it. 12:15:32 12:15:32 :return: New socket connection. 12:15:32 """ 12:15:32 try: 12:15:32 > sock = connection.create_connection( 12:15:32 (self._dns_host, self.port), 12:15:32 self.timeout, 12:15:32 source_address=self.source_address, 12:15:32 socket_options=self.socket_options, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:32 raise err 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:32 12:15:32 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:32 socket_options = [(6, 1, 1)] 12:15:32 12:15:32 def create_connection( 12:15:32 address: tuple[str, int], 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 source_address: tuple[str, int] | None = None, 12:15:32 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:32 ) -> socket.socket: 12:15:32 """Connect to *address* and return the socket object. 12:15:32 12:15:32 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:32 port)``) and return the socket object. Passing the optional 12:15:32 *timeout* parameter will set the timeout on the socket instance 12:15:32 before attempting to connect. 
If no *timeout* is supplied, the 12:15:32 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:32 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:32 for the socket to bind as a source address before making the connection. 12:15:32 An host of '' or port 0 tells the OS to use the default. 12:15:32 """ 12:15:32 12:15:32 host, port = address 12:15:32 if host.startswith("["): 12:15:32 host = host.strip("[]") 12:15:32 err = None 12:15:32 12:15:32 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:32 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:32 # The original create_connection function always returns all records. 12:15:32 family = allowed_gai_family() 12:15:32 12:15:32 try: 12:15:32 host.encode("idna") 12:15:32 except UnicodeError: 12:15:32 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:32 12:15:32 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:32 af, socktype, proto, canonname, sa = res 12:15:32 sock = None 12:15:32 try: 12:15:32 sock = socket.socket(af, socktype, proto) 12:15:32 12:15:32 # If provided, set socket level options before connecting. 12:15:32 _set_socket_options(sock, socket_options) 12:15:32 12:15:32 if timeout is not _DEFAULT_TIMEOUT: 12:15:32 sock.settimeout(timeout) 12:15:32 if source_address: 12:15:32 sock.bind(source_address) 12:15:32 > sock.connect(sa) 12:15:32 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:32 12:15:32 The above exception was the direct cause of the following exception: 12:15:32 12:15:32 self = 12:15:32 method = 'DELETE' 12:15:32 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' 12:15:32 body = None 12:15:32 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:32 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:32 redirect = False, assert_same_host = False 12:15:32 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:32 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:32 decode_content = False, response_kw = {} 12:15:32 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query=None, fragment=None) 12:15:32 destination_scheme = None, conn = None, release_this_conn = True 12:15:32 http_tunnel_required = False, err = None, clean_exit = False 12:15:32 12:15:32 def urlopen( # type: ignore[override] 12:15:32 self, 12:15:32 method: str, 12:15:32 url: str, 12:15:32 body: _TYPE_BODY | None = None, 12:15:32 headers: typing.Mapping[str, str] | None = None, 12:15:32 retries: Retry | bool | int | None = None, 12:15:32 redirect: bool = True, 12:15:32 assert_same_host: bool = True, 12:15:32 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:32 pool_timeout: int | None = None, 12:15:32 release_conn: bool | None = None, 12:15:32 chunked: bool = False, 12:15:32 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:32 preload_content: bool = True, 12:15:32 decode_content: bool = True, 12:15:32 **response_kw: typing.Any, 
12:15:32 ) -> BaseHTTPResponse: 12:15:32 """ 12:15:32 Get a connection from the pool and perform an HTTP request. This is the 12:15:32 lowest level call for making a request, so you'll need to specify all 12:15:32 the raw details. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 More commonly, it's appropriate to use a convenience method 12:15:32 such as :meth:`request`. 12:15:32 12:15:32 .. note:: 12:15:32 12:15:32 `release_conn` will only behave as expected if 12:15:32 `preload_content=False` because we want to make 12:15:32 `preload_content=False` the default behaviour someday soon without 12:15:32 breaking backwards compatibility. 12:15:32 12:15:32 :param method: 12:15:32 HTTP request method (such as GET, POST, PUT, etc.) 12:15:32 12:15:32 :param url: 12:15:32 The URL to perform the request on. 12:15:32 12:15:32 :param body: 12:15:32 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:32 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:32 12:15:32 :param headers: 12:15:32 Dictionary of custom headers to send, such as User-Agent, 12:15:32 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:32 these headers completely replace any pool-specific headers. 12:15:32 12:15:32 :param retries: 12:15:32 Configure the number of retries to allow before raising a 12:15:32 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:32 12:15:32 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:32 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:32 over different types of retries. 12:15:32 Pass an integer number to retry connection errors that many times, 12:15:32 but no other types of errors. Pass zero to never retry. 12:15:32 12:15:32 If ``False``, then retries are disabled and any exception is raised 12:15:32 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:32 the redirect response will be returned. 12:15:32 12:15:32 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:32 12:15:32 :param redirect: 12:15:32 If True, automatically handle redirects (status codes 301, 302, 12:15:32 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:32 will disable redirect, too. 12:15:32 12:15:32 :param assert_same_host: 12:15:32 If ``True``, will make sure that the host of the pool requests is 12:15:32 consistent else will raise HostChangedError. When ``False``, you can 12:15:32 use the pool on an HTTP proxy and request foreign hosts. 12:15:32 12:15:32 :param timeout: 12:15:32 If specified, overrides the default timeout for this one 12:15:32 request. It may be a float (in seconds) or an instance of 12:15:32 :class:`urllib3.util.Timeout`. 12:15:32 12:15:32 :param pool_timeout: 12:15:32 If set and the pool is set to block=True, then this method will 12:15:32 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:32 connection is available within the time period. 12:15:32 12:15:32 :param bool preload_content: 12:15:32 If True, the response's body will be preloaded into memory. 12:15:32 12:15:32 :param bool decode_content: 12:15:32 If True, will attempt to decode the body based on the 12:15:32 'content-encoding' header. 12:15:32 12:15:32 :param release_conn: 12:15:32 If False, then the urlopen call will not release the connection 12:15:32 back into the pool once a response is received (but will release if 12:15:32 you read the entire contents of the response such as when 12:15:32 `preload_content=True`). 
This is useful if you're not preloading 12:15:32 the response's content immediately. You will need to call 12:15:32 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:32 back into the pool. If None, it takes the value of ``preload_content`` 12:15:32 which defaults to ``True``. 12:15:32 12:15:32 :param bool chunked: 12:15:32 If True, urllib3 will send the body using chunked transfer 12:15:32 encoding. Otherwise, urllib3 will send the body using the standard 12:15:32 content-length form. Defaults to False. 12:15:32 12:15:32 :param int body_pos: 12:15:32 Position to seek to in file-like body in the event of a retry or 12:15:32 redirect. Typically this won't need to be set because urllib3 will 12:15:32 auto-populate the value when needed. 12:15:32 """ 12:15:32 parsed_url = parse_url(url) 12:15:32 destination_scheme = parsed_url.scheme 12:15:32 12:15:32 if headers is None: 12:15:32 headers = self.headers 12:15:32 12:15:32 if not isinstance(retries, Retry): 12:15:32 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:32 12:15:32 if release_conn is None: 12:15:32 release_conn = preload_content 12:15:32 12:15:32 # Check host 12:15:32 if assert_same_host and not self.is_same_host(url): 12:15:32 raise HostChangedError(self, url, retries) 12:15:32 12:15:32 # Ensure that the URL we're connecting to is properly encoded 12:15:32 if url.startswith("/"): 12:15:32 url = to_str(_encode_target(url)) 12:15:32 else: 12:15:32 url = to_str(parsed_url.url) 12:15:32 12:15:32 conn = None 12:15:32 12:15:32 # Track whether `conn` needs to be released before 12:15:32 # returning/raising/recursing. Update this variable if necessary, and 12:15:32 # leave `release_conn` constant throughout the function. That way, if 12:15:32 # the function recurses, the original value of `release_conn` will be 12:15:32 # passed down into the recursive call, and its value will be respected. 12:15:32 # 12:15:32 # See issue #651 [1] for details. 12:15:32 # 12:15:32 # [1] 12:15:32 release_this_conn = release_conn 12:15:32 12:15:32 http_tunnel_required = connection_requires_http_tunnel( 12:15:32 self.proxy, self.proxy_config, destination_scheme 12:15:32 ) 12:15:32 12:15:32 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:32 # have to copy the headers dict so we can safely change it without those 12:15:32 # changes being reflected in anyone else's copy. 12:15:32 if not http_tunnel_required: 12:15:32 headers = headers.copy() # type: ignore[attr-defined] 12:15:32 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:32 12:15:32 # Must keep the exception bound to a separate variable or else Python 3 12:15:32 # complains about UnboundLocalError. 12:15:32 err = None 12:15:32 12:15:32 # Keep track of whether we cleanly exited the except block. This 12:15:32 # ensures we do proper cleanup in finally. 12:15:32 clean_exit = False 12:15:32 12:15:32 # Rewind body position, if needed. Record current position 12:15:32 # for future rewinds in the event of a redirect/retry. 12:15:32 body_pos = set_file_position(body, body_pos) 12:15:32 12:15:32 try: 12:15:32 # Request a connection from the queue. 12:15:32 timeout_obj = self._get_timeout(timeout) 12:15:32 conn = self._get_conn(timeout=pool_timeout) 12:15:32 12:15:32 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:32 12:15:32 # Is this a closed/new connection that requires CONNECT tunnelling? 
12:15:32 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:32 try: 12:15:32 self._prepare_proxy(conn) 12:15:32 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:32 self._raise_timeout( 12:15:32 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:32 ) 12:15:32 raise 12:15:32 12:15:32 # If we're going to release the connection in ``finally:``, then 12:15:32 # the response doesn't need to know about the connection. Otherwise 12:15:32 # it will also try to release it and we'll have a double-release 12:15:32 # mess. 12:15:32 response_conn = conn if not release_conn else None 12:15:32 12:15:32 # Make the request on the HTTPConnection object 12:15:32 > response = self._make_request( 12:15:32 conn, 12:15:32 method, 12:15:32 url, 12:15:32 timeout=timeout_obj, 12:15:32 body=body, 12:15:32 headers=headers, 12:15:32 chunked=chunked, 12:15:32 retries=retries, 12:15:32 response_conn=response_conn, 12:15:32 preload_content=preload_content, 12:15:32 decode_content=decode_content, 12:15:32 **response_kw, 12:15:32 ) 12:15:32 12:15:32 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:32 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:33 conn.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:33 self.endheaders() 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:33 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:33 self.send(msg) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:33 self.connect() 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:33 self.sock = self._new_conn() 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 except socket.gaierror as e: 12:15:33 raise NameResolutionError(self.host, self, e) from e 12:15:33 except SocketTimeout as e: 12:15:33 raise ConnectTimeoutError( 12:15:33 self, 12:15:33 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 12:15:33 ) from e 12:15:33 12:15:33 except OSError as e: 12:15:33 > raise NewConnectionError( 12:15:33 self, f"Failed to establish a new connection: {e}" 12:15:33 ) from e 12:15:33 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 
12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 > resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:33 retries = retries.increment( 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 method = 'DELETE' 12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' 12:15:33 response = None 12:15:33 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:33 _pool = 12:15:33 _stacktrace = 12:15:33 12:15:33 def increment( 12:15:33 self, 12:15:33 method: str | None = None, 12:15:33 url: str | None = None, 12:15:33 response: BaseHTTPResponse | None = None, 12:15:33 error: Exception | None = None, 12:15:33 _pool: ConnectionPool | None = None, 12:15:33 _stacktrace: TracebackType | None = None, 12:15:33 ) -> Self: 12:15:33 """Return a new Retry object with incremented retry counters. 12:15:33 12:15:33 :param response: A response object, or None, if the server did not 12:15:33 return a response. 12:15:33 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:33 :param Exception error: An error encountered during the request, or 12:15:33 None if the response was received successfully. 12:15:33 12:15:33 :return: A new ``Retry`` object. 12:15:33 """ 12:15:33 if self.total is False and error: 12:15:33 # Disabled, indicate to re-raise the error. 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 12:15:33 total = self.total 12:15:33 if total is not None: 12:15:33 total -= 1 12:15:33 12:15:33 connect = self.connect 12:15:33 read = self.read 12:15:33 redirect = self.redirect 12:15:33 status_count = self.status 12:15:33 other = self.other 12:15:33 cause = "unknown" 12:15:33 status = None 12:15:33 redirect_location = None 12:15:33 12:15:33 if error and self._is_connection_error(error): 12:15:33 # Connect retry? 12:15:33 if connect is False: 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif connect is not None: 12:15:33 connect -= 1 12:15:33 12:15:33 elif error and self._is_read_error(error): 12:15:33 # Read retry? 12:15:33 if read is False or method is None or not self._is_method_retryable(method): 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif read is not None: 12:15:33 read -= 1 12:15:33 12:15:33 elif error: 12:15:33 # Other retry? 12:15:33 if other is not None: 12:15:33 other -= 1 12:15:33 12:15:33 elif response and response.get_redirect_location(): 12:15:33 # Redirect retry? 
12:15:33 if redirect is not None: 12:15:33 redirect -= 1 12:15:33 cause = "too many redirects" 12:15:33 response_redirect_location = response.get_redirect_location() 12:15:33 if response_redirect_location: 12:15:33 redirect_location = response_redirect_location 12:15:33 status = response.status 12:15:33 12:15:33 else: 12:15:33 # Incrementing because of a server error like a 500 in 12:15:33 # status_forcelist and the given method is in the allowed_methods 12:15:33 cause = ResponseError.GENERIC_ERROR 12:15:33 if response and response.status: 12:15:33 if status_count is not None: 12:15:33 status_count -= 1 12:15:33 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:33 status = response.status 12:15:33 12:15:33 history = self.history + ( 12:15:33 RequestHistory(method, url, error, status, redirect_location), 12:15:33 ) 12:15:33 12:15:33 new_retry = self.new( 12:15:33 total=total, 12:15:33 connect=connect, 12:15:33 read=read, 12:15:33 redirect=redirect, 12:15:33 status=status_count, 12:15:33 other=other, 12:15:33 history=history, 12:15:33 ) 12:15:33 12:15:33 if new_retry.is_exhausted(): 12:15:33 reason = error or ResponseError(cause) 12:15:33 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:33 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:33 12:15:33 During handling of the above exception, another exception occurred: 12:15:33 12:15:33 self = 12:15:33 12:15:33 def test_16_xpdr_device_disconnection(self): 12:15:33 > response = test_utils.unmount_device("XPDRA01") 12:15:33 12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:191: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 transportpce_tests/common/test_utils.py:358: in unmount_device 12:15:33 response = delete_request(url[RESTCONF_VERSION].format('{}', node)) 12:15:33 transportpce_tests/common/test_utils.py:133: in delete_request 12:15:33 return requests.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:33 return session.request(method=method, url=url, **kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:33 resp = self.send(prep, **send_kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:33 r = adapter.send(request, **kwargs) 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 
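The frames above show test_utils.unmount_device() (test_utils.py:358) going through delete_request() (test_utils.py:133) into requests.request(), with the adapter turning the test's timeout into the Timeout(connect=10, read=10) object printed earlier. A hedged sketch of that call shape, not the project's actual transportpce_tests/common/test_utils.py; the base URL, credentials and timeout tuple are assumptions taken from the headers and Timeout values in this log:

import requests

def unmount_device_sketch(node: str) -> requests.Response:
    # DELETE the NETCONF mount point, as in the URL shown for test_16 above
    url = ("http://localhost:8182/rests/data/network-topology:network-topology/"
           f"topology=topology-netconf/node={node}")
    return requests.request(
        "DELETE",
        url,
        auth=("admin", "admin"),  # the log's "Basic YWRtaW46YWRtaW4=" header
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        timeout=(10, 10),  # the adapter converts this tuple into Timeout(connect=10, read=10)
    )

With the controller down, this call raises requests.exceptions.ConnectionError, which is exactly how test_16_xpdr_device_disconnection fails above.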
12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 except (ProtocolError, OSError) as err: 12:15:33 raise ConnectionError(err, request=request) 12:15:33 12:15:33 except MaxRetryError as e: 12:15:33 if isinstance(e.reason, ConnectTimeoutError): 12:15:33 # TODO: Remove this in 3.0.0: see #2811 12:15:33 if not isinstance(e.reason, NewConnectionError): 12:15:33 raise ConnectTimeout(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, ResponseError): 12:15:33 raise RetryError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _ProxyError): 12:15:33 raise ProxyError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _SSLError): 12:15:33 # This branch is for urllib3 v1.22 and later. 
12:15:33 raise SSLError(e, request=request) 12:15:33 12:15:33 > raise ConnectionError(e, request=request) 12:15:33 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:33 ----------------------------- Captured stdout call ----------------------------- 12:15:33 execution of test_16_xpdr_device_disconnection 12:15:33 _______ TransportPCEPortMappingTesting.test_17_xpdr_device_disconnected ________ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 > sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:33 raise err 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:33 socket_options = [(6, 1, 1)] 12:15:33 12:15:33 def create_connection( 12:15:33 address: tuple[str, int], 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 source_address: tuple[str, int] | None = None, 12:15:33 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:33 ) -> socket.socket: 12:15:33 """Connect to *address* and return the socket object. 12:15:33 12:15:33 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:33 port)``) and return the socket object. Passing the optional 12:15:33 *timeout* parameter will set the timeout on the socket instance 12:15:33 before attempting to connect. If no *timeout* is supplied, the 12:15:33 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:33 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:33 for the socket to bind as a source address before making the connection. 12:15:33 An host of '' or port 0 tells the OS to use the default. 12:15:33 """ 12:15:33 12:15:33 host, port = address 12:15:33 if host.startswith("["): 12:15:33 host = host.strip("[]") 12:15:33 err = None 12:15:33 12:15:33 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:33 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:33 # The original create_connection function always returns all records. 
12:15:33 family = allowed_gai_family() 12:15:33 12:15:33 try: 12:15:33 host.encode("idna") 12:15:33 except UnicodeError: 12:15:33 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:33 12:15:33 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:33 af, socktype, proto, canonname, sa = res 12:15:33 sock = None 12:15:33 try: 12:15:33 sock = socket.socket(af, socktype, proto) 12:15:33 12:15:33 # If provided, set socket level options before connecting. 12:15:33 _set_socket_options(sock, socket_options) 12:15:33 12:15:33 if timeout is not _DEFAULT_TIMEOUT: 12:15:33 sock.settimeout(timeout) 12:15:33 if source_address: 12:15:33 sock.bind(source_address) 12:15:33 > sock.connect(sa) 12:15:33 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 method = 'GET' 12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' 12:15:33 body = None 12:15:33 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:33 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 redirect = False, assert_same_host = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:33 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:33 decode_content = False, response_kw = {} 12:15:33 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query='content=nonconfig', fragment=None) 12:15:33 destination_scheme = None, conn = None, release_this_conn = True 12:15:33 http_tunnel_required = False, err = None, clean_exit = False 12:15:33 12:15:33 def urlopen( # type: ignore[override] 12:15:33 self, 12:15:33 method: str, 12:15:33 url: str, 12:15:33 body: _TYPE_BODY | None = None, 12:15:33 headers: typing.Mapping[str, str] | None = None, 12:15:33 retries: Retry | bool | int | None = None, 12:15:33 redirect: bool = True, 12:15:33 assert_same_host: bool = True, 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 pool_timeout: int | None = None, 12:15:33 release_conn: bool | None = None, 12:15:33 chunked: bool = False, 12:15:33 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:33 preload_content: bool = True, 12:15:33 decode_content: bool = True, 12:15:33 **response_kw: typing.Any, 12:15:33 ) -> BaseHTTPResponse: 12:15:33 """ 12:15:33 Get a connection from the pool and perform an HTTP request. This is the 12:15:33 lowest level call for making a request, so you'll need to specify all 12:15:33 the raw details. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 More commonly, it's appropriate to use a convenience method 12:15:33 such as :meth:`request`. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 `release_conn` will only behave as expected if 12:15:33 `preload_content=False` because we want to make 12:15:33 `preload_content=False` the default behaviour someday soon without 12:15:33 breaking backwards compatibility. 12:15:33 12:15:33 :param method: 12:15:33 HTTP request method (such as GET, POST, PUT, etc.) 
12:15:33 12:15:33 :param url: 12:15:33 The URL to perform the request on. 12:15:33 12:15:33 :param body: 12:15:33 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:33 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:33 12:15:33 :param headers: 12:15:33 Dictionary of custom headers to send, such as User-Agent, 12:15:33 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:33 these headers completely replace any pool-specific headers. 12:15:33 12:15:33 :param retries: 12:15:33 Configure the number of retries to allow before raising a 12:15:33 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:33 12:15:33 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:33 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:33 over different types of retries. 12:15:33 Pass an integer number to retry connection errors that many times, 12:15:33 but no other types of errors. Pass zero to never retry. 12:15:33 12:15:33 If ``False``, then retries are disabled and any exception is raised 12:15:33 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:33 the redirect response will be returned. 12:15:33 12:15:33 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:33 12:15:33 :param redirect: 12:15:33 If True, automatically handle redirects (status codes 301, 302, 12:15:33 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:33 will disable redirect, too. 12:15:33 12:15:33 :param assert_same_host: 12:15:33 If ``True``, will make sure that the host of the pool requests is 12:15:33 consistent else will raise HostChangedError. When ``False``, you can 12:15:33 use the pool on an HTTP proxy and request foreign hosts. 12:15:33 12:15:33 :param timeout: 12:15:33 If specified, overrides the default timeout for this one 12:15:33 request. It may be a float (in seconds) or an instance of 12:15:33 :class:`urllib3.util.Timeout`. 12:15:33 12:15:33 :param pool_timeout: 12:15:33 If set and the pool is set to block=True, then this method will 12:15:33 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:33 connection is available within the time period. 12:15:33 12:15:33 :param bool preload_content: 12:15:33 If True, the response's body will be preloaded into memory. 12:15:33 12:15:33 :param bool decode_content: 12:15:33 If True, will attempt to decode the body based on the 12:15:33 'content-encoding' header. 12:15:33 12:15:33 :param release_conn: 12:15:33 If False, then the urlopen call will not release the connection 12:15:33 back into the pool once a response is received (but will release if 12:15:33 you read the entire contents of the response such as when 12:15:33 `preload_content=True`). This is useful if you're not preloading 12:15:33 the response's content immediately. You will need to call 12:15:33 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:33 back into the pool. If None, it takes the value of ``preload_content`` 12:15:33 which defaults to ``True``. 12:15:33 12:15:33 :param bool chunked: 12:15:33 If True, urllib3 will send the body using chunked transfer 12:15:33 encoding. Otherwise, urllib3 will send the body using the standard 12:15:33 content-length form. Defaults to False. 12:15:33 12:15:33 :param int body_pos: 12:15:33 Position to seek to in file-like body in the event of a retry or 12:15:33 redirect. 
Typically this won't need to be set because urllib3 will 12:15:33 auto-populate the value when needed. 12:15:33 """ 12:15:33 parsed_url = parse_url(url) 12:15:33 destination_scheme = parsed_url.scheme 12:15:33 12:15:33 if headers is None: 12:15:33 headers = self.headers 12:15:33 12:15:33 if not isinstance(retries, Retry): 12:15:33 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:33 12:15:33 if release_conn is None: 12:15:33 release_conn = preload_content 12:15:33 12:15:33 # Check host 12:15:33 if assert_same_host and not self.is_same_host(url): 12:15:33 raise HostChangedError(self, url, retries) 12:15:33 12:15:33 # Ensure that the URL we're connecting to is properly encoded 12:15:33 if url.startswith("/"): 12:15:33 url = to_str(_encode_target(url)) 12:15:33 else: 12:15:33 url = to_str(parsed_url.url) 12:15:33 12:15:33 conn = None 12:15:33 12:15:33 # Track whether `conn` needs to be released before 12:15:33 # returning/raising/recursing. Update this variable if necessary, and 12:15:33 # leave `release_conn` constant throughout the function. That way, if 12:15:33 # the function recurses, the original value of `release_conn` will be 12:15:33 # passed down into the recursive call, and its value will be respected. 12:15:33 # 12:15:33 # See issue #651 [1] for details. 12:15:33 # 12:15:33 # [1] 12:15:33 release_this_conn = release_conn 12:15:33 12:15:33 http_tunnel_required = connection_requires_http_tunnel( 12:15:33 self.proxy, self.proxy_config, destination_scheme 12:15:33 ) 12:15:33 12:15:33 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:33 # have to copy the headers dict so we can safely change it without those 12:15:33 # changes being reflected in anyone else's copy. 12:15:33 if not http_tunnel_required: 12:15:33 headers = headers.copy() # type: ignore[attr-defined] 12:15:33 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:33 12:15:33 # Must keep the exception bound to a separate variable or else Python 3 12:15:33 # complains about UnboundLocalError. 12:15:33 err = None 12:15:33 12:15:33 # Keep track of whether we cleanly exited the except block. This 12:15:33 # ensures we do proper cleanup in finally. 12:15:33 clean_exit = False 12:15:33 12:15:33 # Rewind body position, if needed. Record current position 12:15:33 # for future rewinds in the event of a redirect/retry. 12:15:33 body_pos = set_file_position(body, body_pos) 12:15:33 12:15:33 try: 12:15:33 # Request a connection from the queue. 12:15:33 timeout_obj = self._get_timeout(timeout) 12:15:33 conn = self._get_conn(timeout=pool_timeout) 12:15:33 12:15:33 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:33 12:15:33 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:33 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:33 try: 12:15:33 self._prepare_proxy(conn) 12:15:33 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:33 self._raise_timeout( 12:15:33 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:33 ) 12:15:33 raise 12:15:33 12:15:33 # If we're going to release the connection in ``finally:``, then 12:15:33 # the response doesn't need to know about the connection. Otherwise 12:15:33 # it will also try to release it and we'll have a double-release 12:15:33 # mess. 
12:15:33 response_conn = conn if not release_conn else None 12:15:33 12:15:33 # Make the request on the HTTPConnection object 12:15:33 > response = self._make_request( 12:15:33 conn, 12:15:33 method, 12:15:33 url, 12:15:33 timeout=timeout_obj, 12:15:33 body=body, 12:15:33 headers=headers, 12:15:33 chunked=chunked, 12:15:33 retries=retries, 12:15:33 response_conn=response_conn, 12:15:33 preload_content=preload_content, 12:15:33 decode_content=decode_content, 12:15:33 **response_kw, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:33 conn.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:33 self.endheaders() 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:33 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:33 self.send(msg) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:33 self.connect() 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:33 self.sock = self._new_conn() 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 except socket.gaierror as e: 12:15:33 raise NameResolutionError(self.host, self, e) from e 12:15:33 except SocketTimeout as e: 12:15:33 raise ConnectTimeoutError( 12:15:33 self, 12:15:33 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:15:33 ) from e 12:15:33 12:15:33 except OSError as e: 12:15:33 > raise NewConnectionError( 12:15:33 self, f"Failed to establish a new connection: {e}" 12:15:33 ) from e 12:15:33 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 
12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 > resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:33 retries = retries.increment( 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 method = 'GET' 12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' 12:15:33 response = None 12:15:33 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:33 _pool = 12:15:33 _stacktrace = 12:15:33 12:15:33 def increment( 12:15:33 self, 12:15:33 method: str | None = None, 12:15:33 url: str | None = None, 12:15:33 response: BaseHTTPResponse | None = None, 12:15:33 error: Exception | None = None, 12:15:33 _pool: ConnectionPool | None = None, 12:15:33 _stacktrace: TracebackType | None = None, 12:15:33 ) -> Self: 12:15:33 """Return a new Retry object with incremented retry counters. 12:15:33 12:15:33 :param response: A response object, or None, if the server did not 12:15:33 return a response. 12:15:33 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:33 :param Exception error: An error encountered during the request, or 12:15:33 None if the response was received successfully. 
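Editor's note: the Timeout(connect=10, read=10, total=None) values seen in these frames come from the tuple-to-TimeoutSauce conversion in HTTPAdapter.send listed above. A caller-side sketch of the same thing; the URL is illustrative only, although localhost:8182 is the controller address used by these tests:

import requests

# A (connect, read) tuple gives separate connect and read timeouts, which
# requests converts into the urllib3 Timeout object visible in the traceback.
try:
    requests.get("http://localhost:8182/rests/data", timeout=(10, 10))
except requests.exceptions.ConnectionError as exc:
    print(f"controller not reachable: {exc}")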
12:15:33 12:15:33 :return: A new ``Retry`` object. 12:15:33 """ 12:15:33 if self.total is False and error: 12:15:33 # Disabled, indicate to re-raise the error. 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 12:15:33 total = self.total 12:15:33 if total is not None: 12:15:33 total -= 1 12:15:33 12:15:33 connect = self.connect 12:15:33 read = self.read 12:15:33 redirect = self.redirect 12:15:33 status_count = self.status 12:15:33 other = self.other 12:15:33 cause = "unknown" 12:15:33 status = None 12:15:33 redirect_location = None 12:15:33 12:15:33 if error and self._is_connection_error(error): 12:15:33 # Connect retry? 12:15:33 if connect is False: 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif connect is not None: 12:15:33 connect -= 1 12:15:33 12:15:33 elif error and self._is_read_error(error): 12:15:33 # Read retry? 12:15:33 if read is False or method is None or not self._is_method_retryable(method): 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif read is not None: 12:15:33 read -= 1 12:15:33 12:15:33 elif error: 12:15:33 # Other retry? 12:15:33 if other is not None: 12:15:33 other -= 1 12:15:33 12:15:33 elif response and response.get_redirect_location(): 12:15:33 # Redirect retry? 12:15:33 if redirect is not None: 12:15:33 redirect -= 1 12:15:33 cause = "too many redirects" 12:15:33 response_redirect_location = response.get_redirect_location() 12:15:33 if response_redirect_location: 12:15:33 redirect_location = response_redirect_location 12:15:33 status = response.status 12:15:33 12:15:33 else: 12:15:33 # Incrementing because of a server error like a 500 in 12:15:33 # status_forcelist and the given method is in the allowed_methods 12:15:33 cause = ResponseError.GENERIC_ERROR 12:15:33 if response and response.status: 12:15:33 if status_count is not None: 12:15:33 status_count -= 1 12:15:33 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:33 status = response.status 12:15:33 12:15:33 history = self.history + ( 12:15:33 RequestHistory(method, url, error, status, redirect_location), 12:15:33 ) 12:15:33 12:15:33 new_retry = self.new( 12:15:33 total=total, 12:15:33 connect=connect, 12:15:33 read=read, 12:15:33 redirect=redirect, 12:15:33 status=status_count, 12:15:33 other=other, 12:15:33 history=history, 12:15:33 ) 12:15:33 12:15:33 if new_retry.is_exhausted(): 12:15:33 reason = error or ResponseError(cause) 12:15:33 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:33 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:33 12:15:33 During handling of the above exception, another exception occurred: 12:15:33 12:15:33 self = 12:15:33 12:15:33 def test_17_xpdr_device_disconnected(self): 12:15:33 > response = test_utils.check_device_connection("XPDRA01") 12:15:33 12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:195: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 transportpce_tests/common/test_utils.py:369: in check_device_connection 12:15:33 response = get_request(url[RESTCONF_VERSION].format('{}', node)) 12:15:33 
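Editor's note: test_17_xpdr_device_disconnected fails here because nothing is listening on localhost:8182, so the GET issued through test_utils.get_request ends in [Errno 111] Connection refused instead of a RESTCONF reply. A hedged sketch of how that condition can be detected up front with plain requests; the endpoint, attempt count and delay are illustrative and are not taken from test_utils:

import time
import requests

def wait_for_restconf(url="http://localhost:8182/rests/data", attempts=30, delay=2):
    """Poll a RESTCONF base URL until the controller answers or attempts run out."""
    for _ in range(attempts):
        try:
            # Any HTTP status (even 401 or 404) proves the TCP listener is up;
            # ConnectionError means the port is still closed.
            requests.get(url, timeout=(10, 10))
            return True
        except requests.exceptions.ConnectionError:
            time.sleep(delay)
    return False

if not wait_for_restconf():
    print("controller on localhost:8182 never came up; requests will fail as above")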
transportpce_tests/common/test_utils.py:116: in get_request 12:15:33 return requests.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:33 return session.request(method=method, url=url, **kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:33 resp = self.send(prep, **send_kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:33 r = adapter.send(request, **kwargs) 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 
12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 except (ProtocolError, OSError) as err: 12:15:33 raise ConnectionError(err, request=request) 12:15:33 12:15:33 except MaxRetryError as e: 12:15:33 if isinstance(e.reason, ConnectTimeoutError): 12:15:33 # TODO: Remove this in 3.0.0: see #2811 12:15:33 if not isinstance(e.reason, NewConnectionError): 12:15:33 raise ConnectTimeout(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, ResponseError): 12:15:33 raise RetryError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _ProxyError): 12:15:33 raise ProxyError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _SSLError): 12:15:33 # This branch is for urllib3 v1.22 and later. 12:15:33 raise SSLError(e, request=request) 12:15:33 12:15:33 > raise ConnectionError(e, request=request) 12:15:33 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:33 ----------------------------- Captured stdout call ----------------------------- 12:15:33 execution of test_17_xpdr_device_disconnected 12:15:33 _______ TransportPCEPortMappingTesting.test_18_xpdr_device_not_connected _______ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 > sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:33 raise err 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:33 socket_options = [(6, 1, 1)] 12:15:33 12:15:33 def create_connection( 12:15:33 address: tuple[str, int], 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 source_address: tuple[str, int] | None = None, 12:15:33 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:33 ) -> socket.socket: 12:15:33 """Connect to *address* and return the socket object. 12:15:33 12:15:33 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:33 port)``) and return the socket object. 
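Editor's note: the except MaxRetryError branch of HTTPAdapter.send shown above is where urllib3 failures turn into the requests-level exceptions reported for these tests; a reason of NewConnectionError falls through every isinstance check and is re-raised as requests.exceptions.ConnectionError. A minimal caller-side sketch of that mapping, with an illustrative URL:

import requests

try:
    requests.get("http://localhost:8182/rests/data", timeout=(10, 10))
except requests.exceptions.ConnectTimeout:
    print("connect timeout (ConnectTimeoutError reason without NewConnectionError)")
except requests.exceptions.RetryError:
    print("retries exhausted on a retryable response (ResponseError reason)")
except requests.exceptions.ConnectionError as exc:
    # NewConnectionError / Errno 111 lands here, as in the log above.
    print(f"connection refused or reset: {exc}")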
Passing the optional 12:15:33 *timeout* parameter will set the timeout on the socket instance 12:15:33 before attempting to connect. If no *timeout* is supplied, the 12:15:33 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:33 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:33 for the socket to bind as a source address before making the connection. 12:15:33 An host of '' or port 0 tells the OS to use the default. 12:15:33 """ 12:15:33 12:15:33 host, port = address 12:15:33 if host.startswith("["): 12:15:33 host = host.strip("[]") 12:15:33 err = None 12:15:33 12:15:33 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:33 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:33 # The original create_connection function always returns all records. 12:15:33 family = allowed_gai_family() 12:15:33 12:15:33 try: 12:15:33 host.encode("idna") 12:15:33 except UnicodeError: 12:15:33 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:33 12:15:33 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:33 af, socktype, proto, canonname, sa = res 12:15:33 sock = None 12:15:33 try: 12:15:33 sock = socket.socket(af, socktype, proto) 12:15:33 12:15:33 # If provided, set socket level options before connecting. 12:15:33 _set_socket_options(sock, socket_options) 12:15:33 12:15:33 if timeout is not _DEFAULT_TIMEOUT: 12:15:33 sock.settimeout(timeout) 12:15:33 if source_address: 12:15:33 sock.bind(source_address) 12:15:33 > sock.connect(sa) 12:15:33 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 method = 'GET' 12:15:33 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:15:33 body = None 12:15:33 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:33 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 redirect = False, assert_same_host = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:33 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:33 decode_content = False, response_kw = {} 12:15:33 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) 12:15:33 destination_scheme = None, conn = None, release_this_conn = True 12:15:33 http_tunnel_required = False, err = None, clean_exit = False 12:15:33 12:15:33 def urlopen( # type: ignore[override] 12:15:33 self, 12:15:33 method: str, 12:15:33 url: str, 12:15:33 body: _TYPE_BODY | None = None, 12:15:33 headers: typing.Mapping[str, str] | None = None, 12:15:33 retries: Retry | bool | int | None = None, 12:15:33 redirect: bool = True, 12:15:33 assert_same_host: bool = True, 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 pool_timeout: int | None = None, 12:15:33 release_conn: bool | None = None, 12:15:33 chunked: bool = False, 12:15:33 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:33 preload_content: bool = True, 
12:15:33 decode_content: bool = True, 12:15:33 **response_kw: typing.Any, 12:15:33 ) -> BaseHTTPResponse: 12:15:33 """ 12:15:33 Get a connection from the pool and perform an HTTP request. This is the 12:15:33 lowest level call for making a request, so you'll need to specify all 12:15:33 the raw details. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 More commonly, it's appropriate to use a convenience method 12:15:33 such as :meth:`request`. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 `release_conn` will only behave as expected if 12:15:33 `preload_content=False` because we want to make 12:15:33 `preload_content=False` the default behaviour someday soon without 12:15:33 breaking backwards compatibility. 12:15:33 12:15:33 :param method: 12:15:33 HTTP request method (such as GET, POST, PUT, etc.) 12:15:33 12:15:33 :param url: 12:15:33 The URL to perform the request on. 12:15:33 12:15:33 :param body: 12:15:33 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:33 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:33 12:15:33 :param headers: 12:15:33 Dictionary of custom headers to send, such as User-Agent, 12:15:33 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:33 these headers completely replace any pool-specific headers. 12:15:33 12:15:33 :param retries: 12:15:33 Configure the number of retries to allow before raising a 12:15:33 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:33 12:15:33 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:33 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:33 over different types of retries. 12:15:33 Pass an integer number to retry connection errors that many times, 12:15:33 but no other types of errors. Pass zero to never retry. 12:15:33 12:15:33 If ``False``, then retries are disabled and any exception is raised 12:15:33 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:33 the redirect response will be returned. 12:15:33 12:15:33 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:33 12:15:33 :param redirect: 12:15:33 If True, automatically handle redirects (status codes 301, 302, 12:15:33 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:33 will disable redirect, too. 12:15:33 12:15:33 :param assert_same_host: 12:15:33 If ``True``, will make sure that the host of the pool requests is 12:15:33 consistent else will raise HostChangedError. When ``False``, you can 12:15:33 use the pool on an HTTP proxy and request foreign hosts. 12:15:33 12:15:33 :param timeout: 12:15:33 If specified, overrides the default timeout for this one 12:15:33 request. It may be a float (in seconds) or an instance of 12:15:33 :class:`urllib3.util.Timeout`. 12:15:33 12:15:33 :param pool_timeout: 12:15:33 If set and the pool is set to block=True, then this method will 12:15:33 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:33 connection is available within the time period. 12:15:33 12:15:33 :param bool preload_content: 12:15:33 If True, the response's body will be preloaded into memory. 12:15:33 12:15:33 :param bool decode_content: 12:15:33 If True, will attempt to decode the body based on the 12:15:33 'content-encoding' header. 
12:15:33 12:15:33 :param release_conn: 12:15:33 If False, then the urlopen call will not release the connection 12:15:33 back into the pool once a response is received (but will release if 12:15:33 you read the entire contents of the response such as when 12:15:33 `preload_content=True`). This is useful if you're not preloading 12:15:33 the response's content immediately. You will need to call 12:15:33 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:33 back into the pool. If None, it takes the value of ``preload_content`` 12:15:33 which defaults to ``True``. 12:15:33 12:15:33 :param bool chunked: 12:15:33 If True, urllib3 will send the body using chunked transfer 12:15:33 encoding. Otherwise, urllib3 will send the body using the standard 12:15:33 content-length form. Defaults to False. 12:15:33 12:15:33 :param int body_pos: 12:15:33 Position to seek to in file-like body in the event of a retry or 12:15:33 redirect. Typically this won't need to be set because urllib3 will 12:15:33 auto-populate the value when needed. 12:15:33 """ 12:15:33 parsed_url = parse_url(url) 12:15:33 destination_scheme = parsed_url.scheme 12:15:33 12:15:33 if headers is None: 12:15:33 headers = self.headers 12:15:33 12:15:33 if not isinstance(retries, Retry): 12:15:33 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:33 12:15:33 if release_conn is None: 12:15:33 release_conn = preload_content 12:15:33 12:15:33 # Check host 12:15:33 if assert_same_host and not self.is_same_host(url): 12:15:33 raise HostChangedError(self, url, retries) 12:15:33 12:15:33 # Ensure that the URL we're connecting to is properly encoded 12:15:33 if url.startswith("/"): 12:15:33 url = to_str(_encode_target(url)) 12:15:33 else: 12:15:33 url = to_str(parsed_url.url) 12:15:33 12:15:33 conn = None 12:15:33 12:15:33 # Track whether `conn` needs to be released before 12:15:33 # returning/raising/recursing. Update this variable if necessary, and 12:15:33 # leave `release_conn` constant throughout the function. That way, if 12:15:33 # the function recurses, the original value of `release_conn` will be 12:15:33 # passed down into the recursive call, and its value will be respected. 12:15:33 # 12:15:33 # See issue #651 [1] for details. 12:15:33 # 12:15:33 # [1] 12:15:33 release_this_conn = release_conn 12:15:33 12:15:33 http_tunnel_required = connection_requires_http_tunnel( 12:15:33 self.proxy, self.proxy_config, destination_scheme 12:15:33 ) 12:15:33 12:15:33 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:33 # have to copy the headers dict so we can safely change it without those 12:15:33 # changes being reflected in anyone else's copy. 12:15:33 if not http_tunnel_required: 12:15:33 headers = headers.copy() # type: ignore[attr-defined] 12:15:33 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:33 12:15:33 # Must keep the exception bound to a separate variable or else Python 3 12:15:33 # complains about UnboundLocalError. 12:15:33 err = None 12:15:33 12:15:33 # Keep track of whether we cleanly exited the except block. This 12:15:33 # ensures we do proper cleanup in finally. 12:15:33 clean_exit = False 12:15:33 12:15:33 # Rewind body position, if needed. Record current position 12:15:33 # for future rewinds in the event of a redirect/retry. 12:15:33 body_pos = set_file_position(body, body_pos) 12:15:33 12:15:33 try: 12:15:33 # Request a connection from the queue. 
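Editor's note: the urlopen body above normalizes whatever it receives through Retry.from_int, while the Retry(total=0, connect=None, read=False, ...) objects in these traces are the adapter-level default that requests passes down as max_retries. A small sketch of that normalization, with illustrative values:

from urllib3.util.retry import Retry

# An integer becomes a Retry with that total; None falls back to the default
# policy (Retry.DEFAULT, i.e. three retries, unless another default is given).
print(Retry.from_int(2))
print(Retry.from_int(None))

# An existing Retry instance passes through unchanged, so a caller can hand
# urlopen a fully specified policy such as the "never retry" one in this log.
no_retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
print(Retry.from_int(no_retry) is no_retry)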
12:15:33 timeout_obj = self._get_timeout(timeout) 12:15:33 conn = self._get_conn(timeout=pool_timeout) 12:15:33 12:15:33 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:33 12:15:33 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:33 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:33 try: 12:15:33 self._prepare_proxy(conn) 12:15:33 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:33 self._raise_timeout( 12:15:33 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:33 ) 12:15:33 raise 12:15:33 12:15:33 # If we're going to release the connection in ``finally:``, then 12:15:33 # the response doesn't need to know about the connection. Otherwise 12:15:33 # it will also try to release it and we'll have a double-release 12:15:33 # mess. 12:15:33 response_conn = conn if not release_conn else None 12:15:33 12:15:33 # Make the request on the HTTPConnection object 12:15:33 > response = self._make_request( 12:15:33 conn, 12:15:33 method, 12:15:33 url, 12:15:33 timeout=timeout_obj, 12:15:33 body=body, 12:15:33 headers=headers, 12:15:33 chunked=chunked, 12:15:33 retries=retries, 12:15:33 response_conn=response_conn, 12:15:33 preload_content=preload_content, 12:15:33 decode_content=decode_content, 12:15:33 **response_kw, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request 12:15:33 conn.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request 12:15:33 self.endheaders() 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders 12:15:33 self._send_output(message_body, encode_chunked=encode_chunked) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output 12:15:33 self.send(msg) 12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send 12:15:33 self.connect() 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect 12:15:33 self.sock = self._new_conn() 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 except socket.gaierror as e: 12:15:33 raise NameResolutionError(self.host, self, e) from e 12:15:33 except SocketTimeout as e: 12:15:33 raise ConnectTimeoutError( 12:15:33 self, 12:15:33 f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", 12:15:33 ) from e 12:15:33 12:15:33 except OSError as e: 12:15:33 > raise NewConnectionError( 12:15:33 self, f"Failed to establish a new connection: {e}" 12:15:33 ) from e 12:15:33 E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 
12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 > resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen 12:15:33 retries = retries.increment( 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 method = 'GET' 12:15:33 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:15:33 response = None 12:15:33 error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') 12:15:33 _pool = 12:15:33 _stacktrace = 12:15:33 12:15:33 def increment( 12:15:33 self, 12:15:33 method: str | None = None, 12:15:33 url: str | None = None, 12:15:33 response: BaseHTTPResponse | None = None, 12:15:33 error: Exception | None = None, 12:15:33 _pool: ConnectionPool | None = None, 12:15:33 _stacktrace: TracebackType | None = None, 12:15:33 ) -> Self: 12:15:33 """Return a new Retry object with incremented retry counters. 12:15:33 12:15:33 :param response: A response object, or None, if the server did not 12:15:33 return a response. 12:15:33 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:15:33 :param Exception error: An error encountered during the request, or 12:15:33 None if the response was received successfully. 12:15:33 12:15:33 :return: A new ``Retry`` object. 12:15:33 """ 12:15:33 if self.total is False and error: 12:15:33 # Disabled, indicate to re-raise the error. 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 12:15:33 total = self.total 12:15:33 if total is not None: 12:15:33 total -= 1 12:15:33 12:15:33 connect = self.connect 12:15:33 read = self.read 12:15:33 redirect = self.redirect 12:15:33 status_count = self.status 12:15:33 other = self.other 12:15:33 cause = "unknown" 12:15:33 status = None 12:15:33 redirect_location = None 12:15:33 12:15:33 if error and self._is_connection_error(error): 12:15:33 # Connect retry? 12:15:33 if connect is False: 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif connect is not None: 12:15:33 connect -= 1 12:15:33 12:15:33 elif error and self._is_read_error(error): 12:15:33 # Read retry? 12:15:33 if read is False or method is None or not self._is_method_retryable(method): 12:15:33 raise reraise(type(error), error, _stacktrace) 12:15:33 elif read is not None: 12:15:33 read -= 1 12:15:33 12:15:33 elif error: 12:15:33 # Other retry? 12:15:33 if other is not None: 12:15:33 other -= 1 12:15:33 12:15:33 elif response and response.get_redirect_location(): 12:15:33 # Redirect retry? 
12:15:33 if redirect is not None: 12:15:33 redirect -= 1 12:15:33 cause = "too many redirects" 12:15:33 response_redirect_location = response.get_redirect_location() 12:15:33 if response_redirect_location: 12:15:33 redirect_location = response_redirect_location 12:15:33 status = response.status 12:15:33 12:15:33 else: 12:15:33 # Incrementing because of a server error like a 500 in 12:15:33 # status_forcelist and the given method is in the allowed_methods 12:15:33 cause = ResponseError.GENERIC_ERROR 12:15:33 if response and response.status: 12:15:33 if status_count is not None: 12:15:33 status_count -= 1 12:15:33 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:15:33 status = response.status 12:15:33 12:15:33 history = self.history + ( 12:15:33 RequestHistory(method, url, error, status, redirect_location), 12:15:33 ) 12:15:33 12:15:33 new_retry = self.new( 12:15:33 total=total, 12:15:33 connect=connect, 12:15:33 read=read, 12:15:33 redirect=redirect, 12:15:33 status=status_count, 12:15:33 other=other, 12:15:33 history=history, 12:15:33 ) 12:15:33 12:15:33 if new_retry.is_exhausted(): 12:15:33 reason = error or ResponseError(cause) 12:15:33 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:15:33 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError 12:15:33 12:15:33 During handling of the above exception, another exception occurred: 12:15:33 12:15:33 self = 12:15:33 12:15:33 def test_18_xpdr_device_not_connected(self): 12:15:33 > response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) 12:15:33 12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:203: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr 12:15:33 response = get_request(target_url) 12:15:33 transportpce_tests/common/test_utils.py:116: in get_request 12:15:33 return requests.request( 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:15:33 return session.request(method=method, url=url, **kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:15:33 resp = self.send(prep, **send_kwargs) 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:15:33 r = adapter.send(request, **kwargs) 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 self = 12:15:33 request = , stream = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 12:15:33 proxies = OrderedDict() 12:15:33 12:15:33 def send( 12:15:33 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:15:33 ): 12:15:33 """Sends PreparedRequest object. Returns Response object. 12:15:33 12:15:33 :param request: The :class:`PreparedRequest ` being sent. 12:15:33 :param stream: (optional) Whether to stream the request content. 12:15:33 :param timeout: (optional) How long to wait for the server to send 12:15:33 data before giving up, as a float, or a :ref:`(connect timeout, 12:15:33 read timeout) ` tuple. 
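Editor's note: with total=0, the increment() listing above decrements total to -1 on the first connection error, is_exhausted() becomes true, and MaxRetryError is raised wrapping the original NewConnectionError, which is exactly the test_18 trace. A standalone sketch of that path, with no pool or real socket involved and an illustrative URL:

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
error = NewConnectionError(None, "Failed to establish a new connection")

try:
    # One failed attempt is enough: total goes from 0 to -1 and the policy is exhausted.
    retries = retries.increment(method="GET", url="/rests/data/example", error=error)
except MaxRetryError as exc:
    print(exc.reason)   # the original NewConnectionError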
12:15:33 :type timeout: float or tuple or urllib3 Timeout object 12:15:33 :param verify: (optional) Either a boolean, in which case it controls whether 12:15:33 we verify the server's TLS certificate, or a string, in which case it 12:15:33 must be a path to a CA bundle to use 12:15:33 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:15:33 :param proxies: (optional) The proxies dictionary to apply to the request. 12:15:33 :rtype: requests.Response 12:15:33 """ 12:15:33 12:15:33 try: 12:15:33 conn = self.get_connection_with_tls_context( 12:15:33 request, verify, proxies=proxies, cert=cert 12:15:33 ) 12:15:33 except LocationValueError as e: 12:15:33 raise InvalidURL(e, request=request) 12:15:33 12:15:33 self.cert_verify(conn, request.url, verify, cert) 12:15:33 url = self.request_url(request, proxies) 12:15:33 self.add_headers( 12:15:33 request, 12:15:33 stream=stream, 12:15:33 timeout=timeout, 12:15:33 verify=verify, 12:15:33 cert=cert, 12:15:33 proxies=proxies, 12:15:33 ) 12:15:33 12:15:33 chunked = not (request.body is None or "Content-Length" in request.headers) 12:15:33 12:15:33 if isinstance(timeout, tuple): 12:15:33 try: 12:15:33 connect, read = timeout 12:15:33 timeout = TimeoutSauce(connect=connect, read=read) 12:15:33 except ValueError: 12:15:33 raise ValueError( 12:15:33 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:15:33 f"or a single float to set both timeouts to the same value." 12:15:33 ) 12:15:33 elif isinstance(timeout, TimeoutSauce): 12:15:33 pass 12:15:33 else: 12:15:33 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:15:33 12:15:33 try: 12:15:33 resp = conn.urlopen( 12:15:33 method=request.method, 12:15:33 url=url, 12:15:33 body=request.body, 12:15:33 headers=request.headers, 12:15:33 redirect=False, 12:15:33 assert_same_host=False, 12:15:33 preload_content=False, 12:15:33 decode_content=False, 12:15:33 retries=self.max_retries, 12:15:33 timeout=timeout, 12:15:33 chunked=chunked, 12:15:33 ) 12:15:33 12:15:33 except (ProtocolError, OSError) as err: 12:15:33 raise ConnectionError(err, request=request) 12:15:33 12:15:33 except MaxRetryError as e: 12:15:33 if isinstance(e.reason, ConnectTimeoutError): 12:15:33 # TODO: Remove this in 3.0.0: see #2811 12:15:33 if not isinstance(e.reason, NewConnectionError): 12:15:33 raise ConnectTimeout(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, ResponseError): 12:15:33 raise RetryError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _ProxyError): 12:15:33 raise ProxyError(e, request=request) 12:15:33 12:15:33 if isinstance(e.reason, _SSLError): 12:15:33 # This branch is for urllib3 v1.22 and later. 
12:15:33 raise SSLError(e, request=request) 12:15:33 12:15:33 > raise ConnectionError(e, request=request) 12:15:33 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError 12:15:33 ----------------------------- Captured stdout call ----------------------------- 12:15:33 execution of test_18_xpdr_device_not_connected 12:15:33 _______ TransportPCEPortMappingTesting.test_19_rdm_device_disconnection ________ 12:15:33 12:15:33 self = 12:15:33 12:15:33 def _new_conn(self) -> socket.socket: 12:15:33 """Establish a socket connection and set nodelay settings on it. 12:15:33 12:15:33 :return: New socket connection. 12:15:33 """ 12:15:33 try: 12:15:33 > sock = connection.create_connection( 12:15:33 (self._dns_host, self.port), 12:15:33 self.timeout, 12:15:33 source_address=self.source_address, 12:15:33 socket_options=self.socket_options, 12:15:33 ) 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:15:33 raise err 12:15:33 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:15:33 12:15:33 address = ('localhost', 8182), timeout = 10, source_address = None 12:15:33 socket_options = [(6, 1, 1)] 12:15:33 12:15:33 def create_connection( 12:15:33 address: tuple[str, int], 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 source_address: tuple[str, int] | None = None, 12:15:33 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:15:33 ) -> socket.socket: 12:15:33 """Connect to *address* and return the socket object. 12:15:33 12:15:33 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:15:33 port)``) and return the socket object. Passing the optional 12:15:33 *timeout* parameter will set the timeout on the socket instance 12:15:33 before attempting to connect. If no *timeout* is supplied, the 12:15:33 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:15:33 is used. If *source_address* is set it must be a tuple of (host, port) 12:15:33 for the socket to bind as a source address before making the connection. 12:15:33 An host of '' or port 0 tells the OS to use the default. 12:15:33 """ 12:15:33 12:15:33 host, port = address 12:15:33 if host.startswith("["): 12:15:33 host = host.strip("[]") 12:15:33 err = None 12:15:33 12:15:33 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:15:33 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:15:33 # The original create_connection function always returns all records. 
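Editor's note: the ConnectionRefusedError: [Errno 111] at the bottom of this chain is raised by the plain socket connect inside create_connection, before any HTTP is spoken, so a socket-level probe of the same address reproduces it directly. A minimal sketch; the address comes from the log, the timeout value is illustrative:

import socket

# create_connection resolves the host via getaddrinfo and tries each result;
# when nothing listens on the port, every attempt fails with errno 111.
try:
    with socket.create_connection(("localhost", 8182), timeout=10):
        print("something is listening on 8182")
except ConnectionRefusedError as exc:
    print(f"port closed, as in the traceback above: {exc}")
except socket.timeout:
    print("connect timed out")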
12:15:33 family = allowed_gai_family() 12:15:33 12:15:33 try: 12:15:33 host.encode("idna") 12:15:33 except UnicodeError: 12:15:33 raise LocationParseError(f"'{host}', label empty or too long") from None 12:15:33 12:15:33 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:15:33 af, socktype, proto, canonname, sa = res 12:15:33 sock = None 12:15:33 try: 12:15:33 sock = socket.socket(af, socktype, proto) 12:15:33 12:15:33 # If provided, set socket level options before connecting. 12:15:33 _set_socket_options(sock, socket_options) 12:15:33 12:15:33 if timeout is not _DEFAULT_TIMEOUT: 12:15:33 sock.settimeout(timeout) 12:15:33 if source_address: 12:15:33 sock.bind(source_address) 12:15:33 > sock.connect(sa) 12:15:33 E ConnectionRefusedError: [Errno 111] Connection refused 12:15:33 12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:15:33 12:15:33 The above exception was the direct cause of the following exception: 12:15:33 12:15:33 self = 12:15:33 method = 'DELETE' 12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 12:15:33 body = None 12:15:33 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:15:33 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:15:33 redirect = False, assert_same_host = False 12:15:33 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None 12:15:33 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:15:33 decode_content = False, response_kw = {} 12:15:33 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) 12:15:33 destination_scheme = None, conn = None, release_this_conn = True 12:15:33 http_tunnel_required = False, err = None, clean_exit = False 12:15:33 12:15:33 def urlopen( # type: ignore[override] 12:15:33 self, 12:15:33 method: str, 12:15:33 url: str, 12:15:33 body: _TYPE_BODY | None = None, 12:15:33 headers: typing.Mapping[str, str] | None = None, 12:15:33 retries: Retry | bool | int | None = None, 12:15:33 redirect: bool = True, 12:15:33 assert_same_host: bool = True, 12:15:33 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:15:33 pool_timeout: int | None = None, 12:15:33 release_conn: bool | None = None, 12:15:33 chunked: bool = False, 12:15:33 body_pos: _TYPE_BODY_POSITION | None = None, 12:15:33 preload_content: bool = True, 12:15:33 decode_content: bool = True, 12:15:33 **response_kw: typing.Any, 12:15:33 ) -> BaseHTTPResponse: 12:15:33 """ 12:15:33 Get a connection from the pool and perform an HTTP request. This is the 12:15:33 lowest level call for making a request, so you'll need to specify all 12:15:33 the raw details. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 More commonly, it's appropriate to use a convenience method 12:15:33 such as :meth:`request`. 12:15:33 12:15:33 .. note:: 12:15:33 12:15:33 `release_conn` will only behave as expected if 12:15:33 `preload_content=False` because we want to make 12:15:33 `preload_content=False` the default behaviour someday soon without 12:15:33 breaking backwards compatibility. 12:15:33 12:15:33 :param method: 12:15:33 HTTP request method (such as GET, POST, PUT, etc.) 
12:15:33 12:15:33 :param url: 12:15:33 The URL to perform the request on. 12:15:33 12:15:33 :param body: 12:15:33 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:15:33 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:15:33 12:15:33 :param headers: 12:15:33 Dictionary of custom headers to send, such as User-Agent, 12:15:33 If-None-Match, etc. If None, pool headers are used. If provided, 12:15:33 these headers completely replace any pool-specific headers. 12:15:33 12:15:33 :param retries: 12:15:33 Configure the number of retries to allow before raising a 12:15:33 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:15:33 12:15:33 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:15:33 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:15:33 over different types of retries. 12:15:33 Pass an integer number to retry connection errors that many times, 12:15:33 but no other types of errors. Pass zero to never retry. 12:15:33 12:15:33 If ``False``, then retries are disabled and any exception is raised 12:15:33 immediately. Also, instead of raising a MaxRetryError on redirects, 12:15:33 the redirect response will be returned. 12:15:33 12:15:33 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:15:33 12:15:33 :param redirect: 12:15:33 If True, automatically handle redirects (status codes 301, 302, 12:15:33 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:15:33 will disable redirect, too. 12:15:33 12:15:33 :param assert_same_host: 12:15:33 If ``True``, will make sure that the host of the pool requests is 12:15:33 consistent else will raise HostChangedError. When ``False``, you can 12:15:33 use the pool on an HTTP proxy and request foreign hosts. 12:15:33 12:15:33 :param timeout: 12:15:33 If specified, overrides the default timeout for this one 12:15:33 request. It may be a float (in seconds) or an instance of 12:15:33 :class:`urllib3.util.Timeout`. 12:15:33 12:15:33 :param pool_timeout: 12:15:33 If set and the pool is set to block=True, then this method will 12:15:33 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:15:33 connection is available within the time period. 12:15:33 12:15:33 :param bool preload_content: 12:15:33 If True, the response's body will be preloaded into memory. 12:15:33 12:15:33 :param bool decode_content: 12:15:33 If True, will attempt to decode the body based on the 12:15:33 'content-encoding' header. 12:15:33 12:15:33 :param release_conn: 12:15:33 If False, then the urlopen call will not release the connection 12:15:33 back into the pool once a response is received (but will release if 12:15:33 you read the entire contents of the response such as when 12:15:33 `preload_content=True`). This is useful if you're not preloading 12:15:33 the response's content immediately. You will need to call 12:15:33 ``r.release_conn()`` on the response ``r`` to return the connection 12:15:33 back into the pool. If None, it takes the value of ``preload_content`` 12:15:33 which defaults to ``True``. 12:15:33 12:15:33 :param bool chunked: 12:15:33 If True, urllib3 will send the body using chunked transfer 12:15:33 encoding. Otherwise, urllib3 will send the body using the standard 12:15:33 content-length form. Defaults to False. 12:15:33 12:15:33 :param int body_pos: 12:15:33 Position to seek to in file-like body in the event of a retry or 12:15:33 redirect. 
Typically this won't need to be set because urllib3 will 12:15:33 auto-populate the value when needed. 12:15:33 """ 12:15:33 parsed_url = parse_url(url) 12:15:33 destination_scheme = parsed_url.scheme 12:15:33 12:15:33 if headers is None: 12:15:33 headers = self.headers 12:15:33 12:15:33 if not isinstance(retries, Retry): 12:15:33 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:15:33 12:15:33 if release_conn is None: 12:15:33 release_conn = preload_content 12:15:33 12:15:33 # Check host 12:15:33 if assert_same_host and not self.is_same_host(url): 12:15:33 raise HostChangedError(self, url, retries) 12:15:33 12:15:33 # Ensure that the URL we're connecting to is properly encoded 12:15:33 if url.startswith("/"): 12:15:33 url = to_str(_encode_target(url)) 12:15:33 else: 12:15:33 url = to_str(parsed_url.url) 12:15:33 12:15:33 conn = None 12:15:33 12:15:33 # Track whether `conn` needs to be released before 12:15:33 # returning/raising/recursing. Update this variable if necessary, and 12:15:33 # leave `release_conn` constant throughout the function. That way, if 12:15:33 # the function recurses, the original value of `release_conn` will be 12:15:33 # passed down into the recursive call, and its value will be respected. 12:15:33 # 12:15:33 # See issue #651 [1] for details. 12:15:33 # 12:15:33 # [1] 12:15:33 release_this_conn = release_conn 12:15:33 12:15:33 http_tunnel_required = connection_requires_http_tunnel( 12:15:33 self.proxy, self.proxy_config, destination_scheme 12:15:33 ) 12:15:33 12:15:33 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:15:33 # have to copy the headers dict so we can safely change it without those 12:15:33 # changes being reflected in anyone else's copy. 12:15:33 if not http_tunnel_required: 12:15:33 headers = headers.copy() # type: ignore[attr-defined] 12:15:33 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:15:33 12:15:33 # Must keep the exception bound to a separate variable or else Python 3 12:15:33 # complains about UnboundLocalError. 12:15:33 err = None 12:15:33 12:15:33 # Keep track of whether we cleanly exited the except block. This 12:15:33 # ensures we do proper cleanup in finally. 12:15:33 clean_exit = False 12:15:33 12:15:33 # Rewind body position, if needed. Record current position 12:15:33 # for future rewinds in the event of a redirect/retry. 12:15:33 body_pos = set_file_position(body, body_pos) 12:15:33 12:15:33 try: 12:15:33 # Request a connection from the queue. 12:15:33 timeout_obj = self._get_timeout(timeout) 12:15:33 conn = self._get_conn(timeout=pool_timeout) 12:15:33 12:15:33 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:15:33 12:15:33 # Is this a closed/new connection that requires CONNECT tunnelling? 12:15:33 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:15:33 try: 12:15:33 self._prepare_proxy(conn) 12:15:33 except (BaseSSLError, OSError, SocketTimeout) as e: 12:15:33 self._raise_timeout( 12:15:33 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:15:33 ) 12:15:33 raise 12:15:33 12:15:33 # If we're going to release the connection in ``finally:``, then 12:15:33 # the response doesn't need to know about the connection. Otherwise 12:15:33 # it will also try to release it and we'll have a double-release 12:15:33 # mess. 
12:15:33                 response_conn = conn if not release_conn else None
12:15:33                 # Make the request on the HTTPConnection object
12:15:33 >               response = self._make_request(conn, method, url, timeout=timeout_obj,
12:15:33                     body=body, headers=headers, chunked=chunked, retries=retries,
12:15:33                     response_conn=response_conn, preload_content=preload_content,
12:15:33                     decode_content=decode_content, **response_kw)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789:
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request
12:15:33     conn.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request
12:15:33     self.endheaders()
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders
12:15:33     self._send_output(message_body, encode_chunked=encode_chunked)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output
12:15:33     self.send(msg)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send
12:15:33     self.connect()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect
12:15:33     self.sock = self._new_conn()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: in _new_conn
12:15:33 >   raise NewConnectionError(self, f"Failed to establish a new connection: {e}") from e
12:15:33 E   urllib3.exceptions.NewConnectionError: Failed to establish a new connection: [Errno 111] Connection refused
12:15:33 
12:15:33 The above exception was the direct cause of the following exception:
12:15:33 
12:15:33 timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None
12:15:33 proxies = OrderedDict()
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: in send
12:15:33 >   resp = conn.urlopen(method=request.method, url=url, body=request.body,
12:15:33         headers=request.headers, redirect=False, assert_same_host=False,
12:15:33         preload_content=False, decode_content=False, retries=self.max_retries,
12:15:33         timeout=timeout, chunked=chunked)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen
12:15:33     retries = retries.increment(
12:15:33 
12:15:33 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:15:33 method = 'DELETE'
12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01'
12:15:33 response = None
12:15:33 error = NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused')
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: in increment
12:15:33 >   raise MaxRetryError(_pool, url, reason) from reason
12:15:33 E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 
12:15:33 During handling of the above exception, another exception occurred:
12:15:33 
12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:211: in test_19_rdm_device_disconnection
12:15:33 >   response = test_utils.unmount_device("ROADMA01")
12:15:33 transportpce_tests/common/test_utils.py:358: in unmount_device
12:15:33     response = delete_request(url[RESTCONF_VERSION].format('{}', node))
12:15:33 transportpce_tests/common/test_utils.py:133: in delete_request
12:15:33     return requests.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:15:33     return session.request(method=method, url=url, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:15:33     resp = self.send(prep, **send_kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:15:33     r = adapter.send(request, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: in send
12:15:33 >   raise ConnectionError(e, request=request)
12:15:33 E   requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 ----------------------------- Captured stdout call -----------------------------
12:15:33 execution of test_19_rdm_device_disconnection
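The frames above show what test_19 actually does: test_utils.unmount_device("ROADMA01") issues a RESTCONF DELETE against the controller on localhost:8182 with basic admin/admin credentials (the Authorization header in the traceback decodes to admin:admin) and 10-second timeouts. The following is a minimal standalone sketch of that call, not the project's actual helper in transportpce_tests/common/test_utils.py; the base URL, headers, credentials and timeout are taken from the traceback.

    import requests

    # Assumptions (from the traceback above): RESTCONF on http://localhost:8182,
    # admin/admin credentials, 10 s connect/read timeouts.
    RESTCONF_BASE = "http://localhost:8182/rests/data"
    node = "ROADMA01"
    url = (f"{RESTCONF_BASE}/network-topology:network-topology/"
           f"topology=topology-netconf/node={node}")

    response = requests.delete(
        url,
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        auth=("admin", "admin"),
        timeout=10,
    )
    print(response.status_code)

With the controller process already gone, this fails exactly as logged: urllib3 raises NewConnectionError ([Errno 111] Connection refused), the adapter's default Retry(total=0, read=False) is exhausted on the first attempt into MaxRetryError, and requests re-raises it as ConnectionError.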
12:15:33 ________ TransportPCEPortMappingTesting.test_20_rdm_device_disconnected ________
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: in _new_conn
12:15:33 >   sock = connection.create_connection((self._dns_host, self.port), self.timeout,
12:15:33         source_address=self.source_address, socket_options=self.socket_options)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:15:33     raise err
12:15:33 
12:15:33 address = ('localhost', 8182), timeout = 10, source_address = None
12:15:33 socket_options = [(6, 1, 1)]
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: in create_connection
12:15:33 >   sock.connect(sa)
12:15:33 E   ConnectionRefusedError: [Errno 111] Connection refused
12:15:33 
12:15:33 The above exception was the direct cause of the following exception:
12:15:33 
12:15:33 method = 'GET'
12:15:33 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig'
12:15:33 body = None
12:15:33 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:15:33 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:15:33 redirect = False, assert_same_host = False
12:15:33 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
12:15:33 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:15:33 decode_content = False, response_kw = {}
12:15:33 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query='content=nonconfig', fragment=None)
12:15:33 destination_scheme = None, conn = None, release_this_conn = True
12:15:33 http_tunnel_required = False, err = None, clean_exit = False
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: in urlopen
12:15:33 >   response = self._make_request(conn, method, url, timeout=timeout_obj, body=body,
12:15:33         headers=headers, chunked=chunked, retries=retries, response_conn=response_conn,
12:15:33         preload_content=preload_content, decode_content=decode_content, **response_kw)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request
12:15:33     conn.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request
12:15:33     self.endheaders()
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders
12:15:33     self._send_output(message_body, encode_chunked=encode_chunked)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output
12:15:33     self.send(msg)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send
12:15:33     self.connect()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect
12:15:33     self.sock = self._new_conn()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: in _new_conn
12:15:33 >   raise NewConnectionError(self, f"Failed to establish a new connection: {e}") from e
12:15:33 E   urllib3.exceptions.NewConnectionError: Failed to establish a new connection: [Errno 111] Connection refused
12:15:33 
12:15:33 The above exception was the direct cause of the following exception:
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: in send
12:15:33 >   resp = conn.urlopen(method=request.method, url=url, body=request.body,
12:15:33         headers=request.headers, redirect=False, assert_same_host=False,
12:15:33         preload_content=False, decode_content=False, retries=self.max_retries,
12:15:33         timeout=timeout, chunked=chunked)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen
12:15:33     retries = retries.increment(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: in increment
12:15:33 >   raise MaxRetryError(_pool, url, reason) from reason
12:15:33 E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 
12:15:33 During handling of the above exception, another exception occurred:
12:15:33 
12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:215: in test_20_rdm_device_disconnected
12:15:33 >   response = test_utils.check_device_connection("ROADMA01")
12:15:33 transportpce_tests/common/test_utils.py:369: in check_device_connection
12:15:33     response = get_request(url[RESTCONF_VERSION].format('{}', node))
12:15:33 transportpce_tests/common/test_utils.py:116: in get_request
12:15:33     return requests.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:15:33     return session.request(method=method, url=url, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:15:33     resp = self.send(prep, **send_kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:15:33     r = adapter.send(request, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: in send
12:15:33 >   raise ConnectionError(e, request=request)
12:15:33 E   requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 ----------------------------- Captured stdout call -----------------------------
12:15:33 execution of test_20_rdm_device_disconnected
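test_20's helper, test_utils.check_device_connection(), performs the same kind of RESTCONF call as the unmount but as a GET with ?content=nonconfig, which selects the operational (state) data rather than the configuration datastore. A rough standalone equivalent, under the same localhost:8182 / admin:admin assumptions as the previous sketch (the status-code interpretation is an assumption, not the helper's actual logic), would be:

    import requests

    # Assumptions: RESTCONF on http://localhost:8182, admin/admin credentials,
    # 10 s timeouts; '?content=nonconfig' requests the operational view of the node.
    url = ("http://localhost:8182/rests/data/network-topology:network-topology/"
           "topology=topology-netconf/node=ROADMA01?content=nonconfig")

    response = requests.get(
        url,
        headers={"Accept": "application/json"},
        auth=("admin", "admin"),
        timeout=10,
    )
    # Once the node has really been unmounted, this would be expected to return a
    # non-200 status; while the node is still mounted it returns its operational
    # entry as JSON.
    print(response.status_code, response.text)

In this run the request never gets that far: the controller itself is unreachable, so the call fails with the same ConnectionError chain, and test_21 below fails identically on the portmapping node-info GET.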
12:15:33 _______ TransportPCEPortMappingTesting.test_21_rdm_device_not_connected ________
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: in _new_conn
12:15:33 >   sock = connection.create_connection((self._dns_host, self.port), self.timeout,
12:15:33         source_address=self.source_address, socket_options=self.socket_options)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:15:33     raise err
12:15:33 
12:15:33 address = ('localhost', 8182), timeout = 10, source_address = None
12:15:33 socket_options = [(6, 1, 1)]
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: in create_connection
12:15:33 >   sock.connect(sa)
12:15:33 E   ConnectionRefusedError: [Errno 111] Connection refused
12:15:33 
12:15:33 The above exception was the direct cause of the following exception:
12:15:33 
12:15:33 method = 'GET'
12:15:33 url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info'
12:15:33 body = None
12:15:33 headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:15:33 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:15:33 timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None
12:15:33 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info', query=None, fragment=None)
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: in urlopen
12:15:33 >   response = self._make_request(conn, method, url, timeout=timeout_obj, body=body,
12:15:33         headers=headers, chunked=chunked, retries=retries, response_conn=response_conn,
12:15:33         preload_content=preload_content, decode_content=decode_content, **response_kw)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request
12:15:33     conn.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request
12:15:33     self.endheaders()
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders
12:15:33     self._send_output(message_body, encode_chunked=encode_chunked)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output
12:15:33     self.send(msg)
12:15:33 /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send
12:15:33     self.connect()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect
12:15:33     self.sock = self._new_conn()
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: in _new_conn
12:15:33 >   raise NewConnectionError(self, f"Failed to establish a new connection: {e}") from e
12:15:33 E   urllib3.exceptions.NewConnectionError: Failed to establish a new connection: [Errno 111] Connection refused
12:15:33 
12:15:33 The above exception was the direct cause of the following exception:
12:15:33 
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: in send
12:15:33 >   resp = conn.urlopen(method=request.method, url=url, body=request.body,
12:15:33         headers=request.headers, redirect=False, assert_same_host=False,
12:15:33         preload_content=False, decode_content=False, retries=self.max_retries,
12:15:33         timeout=timeout, chunked=chunked)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen
12:15:33     retries = retries.increment(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: in increment
12:15:33 >   raise MaxRetryError(_pool, url, reason) from reason
12:15:33 E   urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 
12:15:33 During handling of the above exception, another exception occurred:
12:15:33 
12:15:33 transportpce_tests/1.2.1/test01_portmapping.py:223: in test_21_rdm_device_not_connected
12:15:33 >   response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None)
12:15:33 transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr
12:15:33     response = get_request(target_url)
12:15:33 transportpce_tests/common/test_utils.py:116: in get_request
12:15:33     return requests.request(
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:15:33     return session.request(method=method, url=url, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:15:33     resp = self.send(prep, **send_kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:15:33     r = adapter.send(request, **kwargs)
12:15:33 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: in send
12:15:33 >   raise ConnectionError(e, request=request)
12:15:33 E   requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError('Failed to establish a new connection: [Errno 111] Connection refused'))
12:15:33 ----------------------------- Captured stdout call -----------------------------
12:15:33 execution of test_21_rdm_device_not_connected
12:15:33 --------------------------- Captured stdout teardown ---------------------------
12:15:33 all processes killed
12:15:33 =========================== short test summary info ============================
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_09_xpdr_portmapping_info
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_10_xpdr_portmapping_NETWORK1
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_11_xpdr_portmapping_NETWORK2
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_12_xpdr_portmapping_CLIENT1
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_13_xpdr_portmapping_CLIENT2
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_14_xpdr_portmapping_CLIENT3
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_15_xpdr_portmapping_CLIENT4
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_16_xpdr_device_disconnection
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_17_xpdr_device_disconnected
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_18_xpdr_device_not_connected
12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_19_rdm_device_disconnection
12:15:33 FAILED
transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_20_rdm_device_disconnected 12:15:33 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_21_rdm_device_not_connected 12:15:33 13 failed, 8 passed in 449.37s (0:07:29) 12:15:33 tests121: exit 1 (449.89 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 pid=36045 12:15:53 ............ [100%] 12:16:06 12 passed in 42.57s 12:16:06 pytest -q transportpce_tests/7.1/test02_otn_renderer.py 12:16:31 .............................................................. [100%] 12:18:41 62 passed in 154.70s (0:02:34) 12:18:41 pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py 12:19:11 ................................................ [100%] 12:20:55 48 passed in 134.17s (0:02:14) 12:20:55 pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py 12:21:20 ...................... [100%] 12:22:07 22 passed in 71.80s (0:01:11) 12:22:08 tests121: FAIL ✖ in 7 minutes 36.68 seconds 12:22:08 tests71: OK ✔ in 6 minutes 50.04 seconds 12:22:08 tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:22:08 tests_network: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:22:13 tests221: freeze> python -m pip freeze --all 12:22:13 tests_network: freeze> python -m pip freeze --all 12:22:13 tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:22:13 tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 12:22:13 using environment variables from ./karaf221.env 12:22:13 tests_network: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:22:13 tests_network: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh network 12:22:13 using environment variables from ./karaf221.env 12:22:13 pytest -q transportpce_tests/2.2.1/test01_portmapping.py 12:22:13 pytest -q transportpce_tests/network/test01_topo_extension.py 12:23:00 ................................... [100%] 12:23:40 35 passed in 86.14s (0:01:26) 12:23:40 pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py 12:24:07 ...... 
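A note on the tests121 failures reported above: the traceback for test_21_rdm_device_not_connected shows that nothing was listening on localhost:8182 anymore, so the GET on /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info died with [Errno 111] Connection refused. With retries already exhausted (Retry(total=0) in the trace), urllib3 converts the refused connection into MaxRetryError, and requests re-raises it as ConnectionError before any HTTP status code exists; a test that expects a RESTCONF answer describing a disconnected device therefore fails at the transport level, and the run of thirteen consecutive FAILED entries in the summary suggests the same thing happened from test_09 onward. The sketch below separates the two situations; it is a minimal illustration, not the project's test_utils code (which goes through get_request() and requests.request()), and the base URL and credentials are assumptions.

    import requests

    RESTCONF_BASE = "http://localhost:8182/rests/data"   # assumed endpoint, copied from the URL in the trace
    AUTH = ("admin", "admin")                             # assumed credentials, for illustration only

    def get_portmapping_node_info(node_id, timeout=(10, 10)):
        """Return (status_code, payload), or (None, None) when the controller itself is unreachable."""
        url = f"{RESTCONF_BASE}/transportpce-portmapping:network/nodes={node_id}/node-info"
        try:
            response = requests.get(url, auth=AUTH, timeout=timeout)
        except requests.exceptions.ConnectionError as exc:
            # [Errno 111] Connection refused ends up here once urllib3 has given up:
            # the controller is down, which is different from "device not connected".
            print(f"controller not reachable: {exc}")
            return None, None
        try:
            payload = response.json()
        except ValueError:
            payload = response.text
        return response.status_code, payload

A helper shaped like this returns a real status code (for example 404 or 409 from RESTCONF) when the controller is up but the device is not mounted, and (None, None) when the controller itself has gone away, which is the case captured in this log.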
[100%] 12:24:19 6 passed in 39.42s 12:24:19 pytest -q transportpce_tests/2.2.1/test03_topology.py 12:24:35 EEEEEEEEEEEEEEEEEE [100%] 12:24:58 ==================================== ERRORS ==================================== 12:24:58 _________ ERROR at setup of TransportPCEtesting.test_01_connect_spdrA __________ 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 test_utils.shutdown_process(cls.processes[0]) 12:24:58 cls.processes[0] = test_utils.start_karaf() 12:24:58 test_utils.process_list[0] = cls.processes[0] 12:24:58 cls.init_failed = not test_utils.wait_until_log_contains( 12:24:58 test_utils.KARAF_LOG, test_utils.KARAF_OK_START_MSG, time_to_wait=60) 12:24:58 if cls.init_failed: 12:24:58 print('tapi installation feature failed...') 12:24:58 test_utils.shutdown_process(cls.processes[0]) 12:24:58 sys.exit(2) 12:24:58 > cls.processes = test_utils.start_sims([('spdra', cls.NODE_VERSION), 12:24:58 ('roadma', cls.NODE_VERSION), 12:24:58 ('roadmc', cls.NODE_VERSION), 12:24:58 ('spdrc', cls.NODE_VERSION)]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:158: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 sims_list = [('spdra', '2.2.1'), ('roadma', '2.2.1'), ('roadmc', '2.2.1'), ('spdrc', '2.2.1')] 12:24:58 12:24:58 def start_sims(sims_list): 12:24:58 if SIMS_TO_USE == 'None': 12:24:58 return None 12:24:58 if SIMS_TO_USE == 'honeynode': 12:24:58 start_msg = HONEYNODE_OK_START_MSG 12:24:58 start_method = start_honeynode 12:24:58 else: 12:24:58 start_msg = LIGHTYNODE_OK_START_MSG 12:24:58 start_method = start_lightynode 12:24:58 for sim in sims_list: 12:24:58 print('starting simulator ' + sim[0] + ' in OpenROADM device version ' + sim[1] + '...') 12:24:58 log_file = os.path.join(SIM_LOG_DIRECTORY, SIMS[sim]['logfile']) 12:24:58 process = start_method(log_file, sim) 12:24:58 if wait_until_log_contains(log_file, start_msg, 100): 12:24:58 print('simulator for ' + sim[0] + ' started') 12:24:58 else: 12:24:58 print('simulator for ' + sim[0] + ' failed to start') 12:24:58 shutdown_process(process) 12:24:58 for pid in process_list: 12:24:58 shutdown_process(pid) 12:24:58 > sys.exit(3) 12:24:58 E SystemExit: 3 12:24:58 12:24:58 transportpce_tests/common/test_utils.py:206: SystemExit 12:24:58 ---------------------------- Captured stdout setup ----------------------------- 12:24:58 starting OpenDaylight... 12:24:58 starting KARAF TransportPCE build... 12:24:58 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! OpenDaylight started ! 12:24:58 installing tapi feature... 
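Although the row of E results above appears right after the pytest command for transportpce_tests/2.2.1/test03_topology.py, the ERRORS that follow belong to the tests_network environment running transportpce_tests/network/test01_topo_extension.py (note the ../.tox/tests_network/... paths); the two tox environments write to the same console in parallel. In that suite, setUpClass starts TransportPCE, installs the odl-transportpce-tapi feature, restarts Karaf and then calls test_utils.start_sims(), which launches each device simulator and waits up to 100 seconds for its start pattern in the simulator log. The spdra simulator came up, but roadma never logged 'Data tree change listeners registered', so start_sims() shut everything down and called sys.exit(3), which pytest reports as SystemExit: 3 during setup (the captured setup output continues below). A standalone sketch of such a wait loop is shown here, under the assumption of a simple poll-the-file strategy; the real helper is wait_until_log_contains() in transportpce_tests/common/test_utils.py and may be implemented differently.

    import os
    import time

    def wait_for_pattern(log_file, pattern, timeout=100):
        """Poll log_file until pattern appears, or give up after timeout seconds."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            if os.path.exists(log_file):
                with open(log_file, errors="ignore") as handle:
                    if pattern in handle.read():
                        return True
            time.sleep(1)   # the simulators can take a while to write their first lines
        return False

Incidentally, the environment guard at the top of setUpClass ('NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True') only evaluates to False when NO_ODL_STARTUP is defined and USE_LIGHTY equals 'True'; in every other configuration, including this run, the tapi feature installation is performed, as the captured output confirms.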
12:24:58 installing feature odl-transportpce-tapi 12:24:58 client: JAVA_HOME not set; results may vary 12:24:58 odl-transportpce-tapi │ 10.0.0.SNAPSHOT │ x │ Started │ odl-transportpce-tapi │ OpenDaylight :: transportpce :: tapi 12:24:58 Restarting OpenDaylight... 12:24:58 starting KARAF TransportPCE build... 12:24:58 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! starting simulator spdra in OpenROADM device version 2.2.1... 12:24:58 Searching for pattern 'Data tree change listeners registered' in spdra-221.log... Pattern found! simulator for spdra started 12:24:58 starting simulator roadma in OpenROADM device version 2.2.1... 12:24:58 Searching for pattern 'Data tree change listeners registered' in roadma-221.log... Pattern not found after 100 seconds! simulator for roadma failed to start 12:24:58 ---------------------------- Captured stderr setup ----------------------------- 12:24:58 SLF4J(W): No SLF4J providers were found. 12:24:58 SLF4J(W): Defaulting to no-operation (NOP) logger implementation 12:24:58 SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details. 12:24:58 SLF4J(W): Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier. 12:24:58 SLF4J(W): Ignoring binding found at [jar:file:/w/workspace/transportpce-tox-verify-transportpce-master/karaf221/target/assembly/system/org/apache/karaf/org.apache.karaf.client/4.4.6/org.apache.karaf.client-4.4.6.jar!/org/slf4j/impl/StaticLoggerBinder.class] 12:24:58 SLF4J(W): See https://www.slf4j.org/codes.html#ignoredBindings for an explanation. 12:24:58 _________ ERROR at setup of TransportPCEtesting.test_02_connect_spdrC __________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 
12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 
12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ---------------------------- Captured stdout setup ----------------------------- 12:24:58 starting OpenDaylight... 12:24:58 starting KARAF TransportPCE build... 12:24:58 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! OpenDaylight started ! 12:24:58 installing tapi feature... 12:24:58 installing feature odl-transportpce-tapi 12:24:58 client: JAVA_HOME not set; results may vary 12:24:58 odl-transportpce-tapi │ 10.0.0.SNAPSHOT │ x │ Started │ odl-transportpce-tapi │ OpenDaylight :: transportpce :: tapi 12:24:58 Restarting OpenDaylight... 12:24:58 ---------------------------- Captured stderr setup ----------------------------- 12:24:58 SLF4J(W): No SLF4J providers were found. 12:24:58 SLF4J(W): Defaulting to no-operation (NOP) logger implementation 12:24:58 SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details. 
12:24:58 SLF4J(W): Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier. 12:24:58 SLF4J(W): Ignoring binding found at [jar:file:/w/workspace/transportpce-tox-verify-transportpce-master/karaf221/target/assembly/system/org/apache/karaf/org.apache.karaf.client/4.4.6/org.apache.karaf.client-4.4.6.jar!/org/slf4j/impl/StaticLoggerBinder.class] 12:24:58 SLF4J(W): See https://www.slf4j.org/codes.html#ignoredBindings for an explanation. 12:24:58 __________ ERROR at setup of TransportPCEtesting.test_03_connect_rdmA __________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 
12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % 
pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 __________ ERROR at setup of TransportPCEtesting.test_04_connect_rdmC __________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid 
< 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_05_connect_sprdA_1_N1_to_TAPI_EXT_roadmTA1_PP1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 
../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 
12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_06_connect_TAPI_EXT_roadmTA1_PP1_to_spdrA_1_N1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a 
positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 
transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_07_connect_sprdC_1_N1_to_TAPI_EXT_roadmTC1_PP1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 
../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 
12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_08_connect_TAPI_EXT_roadmTC1_PP1_to_spdrC_1_N1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a 
positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 
transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _______ ERROR at setup of TransportPCEtesting.test_09_check_otn_topology _______ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 
12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ____ ERROR at setup of TransportPCEtesting.test_10_check_openroadm_topology ____ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 
_psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in 
psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_11_connect_RDMA1_to_TAPI_EXT_roadmTA1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 
12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 
self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _ ERROR at setup of TransportPCEtesting.test_12_connect_RDMC1_to_TAPI_EXT_roadmTC1 _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 
12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in 
psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ___ ERROR at setup of TransportPCEtesting.test_13_getLinks_OpenroadmTopology ___ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 
12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ___ ERROR at setup of TransportPCEtesting.test_14_getNodes_OpenRoadmTopology ___ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 
_psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in 
psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ________ ERROR at setup of TransportPCEtesting.test_15_disconnect_spdrA ________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 
12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 ________ ERROR at setup of TransportPCEtesting.test_16_disconnect_spdrC ________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 
_psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in 
psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _______ ERROR at setup of TransportPCEtesting.test_17_disconnect_roadmA ________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 
12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 _______ ERROR at setup of TransportPCEtesting.test_18_disconnect_roadmC ________ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 > return fun(self, *args, **kwargs) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:508: in wrapper 12:24:58 raise raise_from(err, None) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:506: in wrapper 12:24:58 return fun(self) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1780: in _parse_stat_file 12:24:58 data = bcat("%s/%s/stat" % (self._procfs_path, self.pid)) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:851: in bcat 12:24:58 return cat(fname, fallback=fallback, _open=open_binary) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:839: in cat 12:24:58 with _open(fname) as f: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 fname = '/proc/41727/stat' 12:24:58 12:24:58 def open_binary(fname): 12:24:58 > return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE) 12:24:58 E FileNotFoundError: [Errno 2] No such file or directory: '/proc/41727/stat' 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_common.py:799: FileNotFoundError 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 
_psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 > self.create_time() 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:355: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:757: in create_time 12:24:58 self._create_time = self._proc.create_time() 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1717: in wrapper 12:24:58 return fun(self, *args, **kwargs) 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1948: in create_time 12:24:58 ctime = float(self._parse_stat_file()['create_time']) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = , args = () 12:24:58 kwargs = {} 12:24:58 12:24:58 @functools.wraps(fun) 12:24:58 def wrapper(self, *args, **kwargs): 12:24:58 try: 12:24:58 return fun(self, *args, **kwargs) 12:24:58 except PermissionError: 12:24:58 raise AccessDenied(self.pid, self._name) 12:24:58 except ProcessLookupError: 12:24:58 self._raise_if_zombie() 12:24:58 raise NoSuchProcess(self.pid, self._name) 12:24:58 except FileNotFoundError: 12:24:58 self._raise_if_zombie() 12:24:58 if not os.path.exists("%s/%s" % (self._procfs_path, self.pid)): 12:24:58 > raise NoSuchProcess(self.pid, self._name) 12:24:58 E psutil.NoSuchProcess: process no longer exists (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/_pslinux.py:1726: NoSuchProcess 12:24:58 12:24:58 During handling of the above exception, another exception occurred: 12:24:58 12:24:58 cls = 12:24:58 12:24:58 @classmethod 12:24:58 def setUpClass(cls): 12:24:58 # pylint: disable=unsubscriptable-object 12:24:58 cls.init_failed = False 12:24:58 os.environ['JAVA_MIN_MEM'] = '1024M' 12:24:58 os.environ['JAVA_MAX_MEM'] = '4096M' 12:24:58 cls.processes = test_utils.start_tpce() 12:24:58 # TAPI feature is not installed by default in Karaf 12:24:58 if 'NO_ODL_STARTUP' not in os.environ or 'USE_LIGHTY' not in os.environ or os.environ['USE_LIGHTY'] != 'True': 12:24:58 print('installing tapi feature...') 12:24:58 result = test_utils.install_karaf_feature('odl-transportpce-tapi') 12:24:58 if result.returncode != 0: 12:24:58 cls.init_failed = True 12:24:58 print('Restarting OpenDaylight...') 12:24:58 > test_utils.shutdown_process(cls.processes[0]) 12:24:58 12:24:58 transportpce_tests/network/test01_topo_extension.py:149: 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 transportpce_tests/common/test_utils.py:270: in shutdown_process 12:24:58 for child in 
psutil.Process(process.pid).children(): 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:319: in __init__ 12:24:58 self._init(pid) 12:24:58 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:24:58 12:24:58 self = psutil.Process(pid=41727, status='terminated'), pid = 41727 12:24:58 _ignore_nsp = False 12:24:58 12:24:58 def _init(self, pid, _ignore_nsp=False): 12:24:58 if pid is None: 12:24:58 pid = os.getpid() 12:24:58 else: 12:24:58 if not _PY3 and not isinstance(pid, (int, long)): 12:24:58 msg = "pid must be an integer (got %r)" % pid 12:24:58 raise TypeError(msg) 12:24:58 if pid < 0: 12:24:58 msg = "pid must be a positive integer (got %s)" % pid 12:24:58 raise ValueError(msg) 12:24:58 try: 12:24:58 _psplatform.cext.check_pid_range(pid) 12:24:58 except OverflowError: 12:24:58 msg = "process PID out of range (got %s)" % pid 12:24:58 raise NoSuchProcess(pid, msg=msg) 12:24:58 12:24:58 self._pid = pid 12:24:58 self._name = None 12:24:58 self._exe = None 12:24:58 self._create_time = None 12:24:58 self._gone = False 12:24:58 self._pid_reused = False 12:24:58 self._hash = None 12:24:58 self._lock = threading.RLock() 12:24:58 # used for caching on Windows only (on POSIX ppid may change) 12:24:58 self._ppid = None 12:24:58 # platform-specific modules define an _psplatform.Process 12:24:58 # implementation class 12:24:58 self._proc = _psplatform.Process(pid) 12:24:58 self._last_sys_cpu_times = None 12:24:58 self._last_proc_cpu_times = None 12:24:58 self._exitcode = _SENTINEL 12:24:58 # cache creation time for later use in is_running() method 12:24:58 try: 12:24:58 self.create_time() 12:24:58 except AccessDenied: 12:24:58 # We should never get here as AFAIK we're able to get 12:24:58 # process creation time on all platforms even as a 12:24:58 # limited user. 12:24:58 pass 12:24:58 except ZombieProcess: 12:24:58 # Zombies can still be queried by this class (although 12:24:58 # not always) and pids() return them so just go on. 
12:24:58 pass 12:24:58 except NoSuchProcess: 12:24:58 if not _ignore_nsp: 12:24:58 msg = "process PID not found" 12:24:58 > raise NoSuchProcess(pid, msg=msg) 12:24:58 E psutil.NoSuchProcess: process PID not found (pid=41727) 12:24:58 12:24:58 ../.tox/tests_network/lib/python3.11/site-packages/psutil/__init__.py:368: NoSuchProcess 12:24:58 =========================== short test summary info ============================ 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_01_connect_spdrA 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_02_connect_spdrC 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_03_connect_rdmA 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_04_connect_rdmC 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_05_connect_sprdA_1_N1_to_TAPI_EXT_roadmTA1_PP1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_06_connect_TAPI_EXT_roadmTA1_PP1_to_spdrA_1_N1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_07_connect_sprdC_1_N1_to_TAPI_EXT_roadmTC1_PP1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_08_connect_TAPI_EXT_roadmTC1_PP1_to_spdrC_1_N1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_09_check_otn_topology 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_10_check_openroadm_topology 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_11_connect_RDMA1_to_TAPI_EXT_roadmTA1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_12_connect_RDMC1_to_TAPI_EXT_roadmTC1 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_13_getLinks_OpenroadmTopology 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_14_getNodes_OpenRoadmTopology 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_15_disconnect_spdrA 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_16_disconnect_spdrC 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_17_disconnect_roadmA 12:24:58 ERROR transportpce_tests/network/test01_topo_extension.py::TransportPCEtesting::test_18_disconnect_roadmC 12:24:58 18 errors in 164.08s (0:02:44) 12:24:58 tests_network: exit 1 (164.31 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh network pid=40972 12:25:18 ............................................ 
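The eighteen setup errors above all trace back to the same race: test_utils.shutdown_process builds a psutil.Process for the karaf pid after that process has already exited (here pid 41727), so psutil raises NoSuchProcess before any child process can be inspected or terminated. A minimal sketch of a guard for that race, written here against a generic Popen-like handle and not taken from the repository's actual helper, could look like this:

import psutil

def shutdown_process_safely(process):
    # Sketch only: tolerate a process that has already exited before shutdown.
    if process is None:
        return
    try:
        parent = psutil.Process(process.pid)
        children = parent.children(recursive=True)
    except psutil.NoSuchProcess:
        # The managed process (karaf in this log) is already gone, e.g. after a
        # failed feature installation, so there is nothing left to stop.
        return
    for child in children:
        try:
            child.terminate()
        except psutil.NoSuchProcess:
            pass
    try:
        parent.terminate()
    except psutil.NoSuchProcess:
        pass

Whether silently ignoring NoSuchProcess is the desired behaviour is an assumption; the sketch only shows where such a guard would sit relative to the children() call that fails in the traces above.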
[100%] 12:26:51 44 passed in 151.58s (0:02:31) 12:26:51 pytest -q transportpce_tests/2.2.1/test04_otn_topology.py 12:28:32 EEEEEEEEEEEE [100%] 12:46:52 ==================================== ERRORS ==================================== 12:46:52 ________ ERROR at setup of TransportPCEtesting.test_01_connect_SPDR_SA1 ________ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 _________ ERROR at setup of TransportPCEtesting.test_02_getClliNetwork _________ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 ______ ERROR at setup of TransportPCEtesting.test_03_getOpenRoadmNetwork _______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 ___ ERROR at setup of TransportPCEtesting.test_04_getLinks_OpenroadmTopology ___ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 ___ ERROR at setup of TransportPCEtesting.test_05_getNodes_OpenRoadmTopology ___ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 ______ ERROR at setup of TransportPCEtesting.test_06_getLinks_OtnTopology ______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 ______ ERROR at setup of TransportPCEtesting.test_07_getNodes_OtnTopology ______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 ______ ERROR at setup of TransportPCEtesting.test_08_disconnect_SPDR_SA1 _______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 _________ ERROR at setup of TransportPCEtesting.test_09_getClliNetwork _________ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 ______ ERROR at setup of TransportPCEtesting.test_10_getOpenRoadmNetwork _______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 ___ ERROR at setup of TransportPCEtesting.test_11_getNodes_OpenRoadmTopology ___ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:46:52 ______ ERROR at setup of TransportPCEtesting.test_12_getNodes_OtnTopology ______ 12:46:52 12:46:52 cls = 12:46:52 12:46:52 @classmethod 12:46:52 def setUpClass(cls): 12:46:52 > cls.processes = test_utils.start_tpce() 12:46:52 12:46:52 transportpce_tests/2.2.1/test04_otn_topology.py:34: 12:46:52 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:46:52 12:46:52 def start_tpce(): 12:46:52 if 'NO_ODL_STARTUP' in os.environ: 12:46:52 print('No OpenDaylight instance to start!') 12:46:52 return [] 12:46:52 print('starting OpenDaylight...') 12:46:52 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:46:52 process = start_lighty() 12:46:52 start_msg = LIGHTY_OK_START_MSG 12:46:52 else: 12:46:52 process = start_karaf() 12:46:52 start_msg = KARAF_OK_START_MSG 12:46:52 if wait_until_log_contains(TPCE_LOG, start_msg, time_to_wait=100): 12:46:52 print('OpenDaylight started !') 12:46:52 else: 12:46:52 print('OpenDaylight failed to start !') 12:46:52 shutdown_process(process) 12:46:52 for pid in process_list: 12:46:52 shutdown_process(pid) 12:46:52 > sys.exit(1) 12:46:52 E SystemExit: 1 12:46:52 12:46:52 transportpce_tests/common/test_utils.py:229: SystemExit 12:46:52 ---------------------------- Captured stdout setup ----------------------------- 12:46:52 starting OpenDaylight... 12:46:52 starting KARAF TransportPCE build... 12:46:52 Searching for pattern 'Transportpce controller started' in karaf.log... Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:46:52 =========================== short test summary info ============================ 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_01_connect_SPDR_SA1 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_02_getClliNetwork 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_03_getOpenRoadmNetwork 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_04_getLinks_OpenroadmTopology 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_05_getNodes_OpenRoadmTopology 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_06_getLinks_OtnTopology 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_07_getNodes_OtnTopology 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_08_disconnect_SPDR_SA1 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_09_getClliNetwork 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_10_getOpenRoadmNetwork 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_11_getNodes_OpenRoadmTopology 12:46:52 ERROR transportpce_tests/2.2.1/test04_otn_topology.py::TransportPCEtesting::test_12_getNodes_OtnTopology 12:46:52 12 errors in 1200.57s (0:20:00) 12:46:52 tests_network: FAIL ✖ in 2 minutes 50.53 seconds 12:46:52 tests221: exit 1 (1478.58 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 pid=40964 12:46:52 tests221: FAIL ✖ in 24 minutes 44.8 seconds 12:46:52 tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:46:58 tests_hybrid: freeze> python -m pip freeze --all 12:46:58 tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 12:46:58 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid 12:46:58 using environment variables from ./karaf121.env 12:46:58 pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py 12:50:38 ........FF......................................... 
[100%] 13:13:18 =================================== FAILURES =================================== 13:13:18 _______ TransportPCEFulltesting.test_09_add_omsAttributes_ROADMA_ROADMC ________ 13:13:18 13:13:18 self = 13:13:18 13:13:18 def test_09_add_omsAttributes_ROADMA_ROADMC(self): 13:13:18 # Config ROADMA-ROADMC oms-attributes 13:13:18 data = {"span": { 13:13:18 "auto-spanloss": "true", 13:13:18 "spanloss-base": 11.4, 13:13:18 "spanloss-current": 12, 13:13:18 "engineered-spanloss": 12.2, 13:13:18 "link-concatenation": [{ 13:13:18 "SRLG-Id": 0, 13:13:18 "fiber-type": "smf", 13:13:18 "SRLG-length": 100000, 13:13:18 "pmd": 0.5}]}} 13:13:18 response = test_utils.add_oms_attr_request( 13:13:18 "ROADM-A1-DEG2-DEG2-TTP-TXRXtoROADM-C1-DEG1-DEG1-TTP-TXRX", data) 13:13:18 > self.assertEqual(response.status_code, requests.codes.created) 13:13:18 E AssertionError: 204 != 201 13:13:18 13:13:18 transportpce_tests/hybrid/test01_device_change_notifications.py:161: AssertionError 13:13:18 ----------------------------- Captured stdout call ----------------------------- 13:13:18 execution of test_09_add_omsAttributes_ROADMA_ROADMC 13:13:18 _______ TransportPCEFulltesting.test_10_add_omsAttributes_ROADMC_ROADMA ________ 13:13:18 13:13:18 self = 13:13:18 13:13:18 def test_10_add_omsAttributes_ROADMC_ROADMA(self): 13:13:18 # Config ROADMC-ROADMA oms-attributes 13:13:18 data = {"span": { 13:13:18 "auto-spanloss": "true", 13:13:18 "spanloss-base": 11.4, 13:13:18 "spanloss-current": 12, 13:13:18 "engineered-spanloss": 12.2, 13:13:18 "link-concatenation": [{ 13:13:18 "SRLG-Id": 0, 13:13:18 "fiber-type": "smf", 13:13:18 "SRLG-length": 100000, 13:13:18 "pmd": 0.5}]}} 13:13:18 response = test_utils.add_oms_attr_request( 13:13:18 "ROADM-C1-DEG1-DEG1-TTP-TXRXtoROADM-A1-DEG2-DEG2-TTP-TXRX", data) 13:13:18 > self.assertEqual(response.status_code, requests.codes.created) 13:13:18 E AssertionError: 204 != 201 13:13:18 13:13:18 transportpce_tests/hybrid/test01_device_change_notifications.py:177: AssertionError 13:13:18 ----------------------------- Captured stdout call ----------------------------- 13:13:18 execution of test_10_add_omsAttributes_ROADMC_ROADMA 13:13:18 =========================== short test summary info ============================ 13:13:18 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_09_add_omsAttributes_ROADMA_ROADMC 13:13:18 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_10_add_omsAttributes_ROADMC_ROADMA 13:13:18 2 failed, 49 passed in 1579.53s (0:26:19) 13:13:18 tests_hybrid: exit 1 (1579.76 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid pid=45542 13:13:18 tests_hybrid: FAIL ✖ in 26 minutes 26.04 seconds 13:13:18 buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 13:13:24 buildlighty: freeze> python -m pip freeze --all 13:13:24 buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 13:13:24 buildlighty: commands[0] 
/w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh 13:13:24 NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED 13:13:41 [ERROR] COMPILATION ERROR : 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol 13:13:41 symbol: class YangModuleInfo 13:13:41 location: package org.opendaylight.yangtools.binding 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol 13:13:41 symbol: class YangModuleInfo 13:13:41 location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol 13:13:41 symbol: class YangModuleInfo 13:13:41 location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol 13:13:41 symbol: class YangModuleInfo 13:13:41 location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure: 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol 13:13:41 [ERROR] symbol: class YangModuleInfo 13:13:41 [ERROR] location: package org.opendaylight.yangtools.binding 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol 13:13:41 [ERROR] symbol: class YangModuleInfo 13:13:41 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol 13:13:41 [ERROR] symbol: class YangModuleInfo 13:13:41 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol 13:13:41 [ERROR] symbol: class YangModuleInfo 13:13:41 [ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils 13:13:41 [ERROR] -> [Help 1] 13:13:41 [ERROR] 13:13:41 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 13:13:41 [ERROR] Re-run Maven using the -X switch to enable full debug logging. 13:13:41 [ERROR] 13:13:41 [ERROR] For more information about the errors and possible solutions, please read the following articles: 13:13:41 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException 13:13:41 unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP. 
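The buildlighty failure above is a compilation/dependency mismatch rather than a test problem: TPCEUtils.java imports YangModuleInfo from org.opendaylight.yangtools.binding, but the yangtools artifacts resolved for the lighty build apparently do not expose the class under that package (older yangtools releases shipped it as org.opendaylight.yangtools.yang.binding.YangModuleInfo). A quick, hedged way to check which jar actually provides the class, assuming the default local Maven repository location:

    # Hedged diagnostic: list which yangtools jars in the local Maven
    # repository contain a YangModuleInfo class, and under which Java package.
    # The ~/.m2/repository path is an assumption (the default location);
    # adjust it if the build uses a custom repository.
    import pathlib
    import zipfile

    repo = pathlib.Path.home() / ".m2" / "repository" / "org" / "opendaylight" / "yangtools"
    for jar in sorted(repo.rglob("*.jar")):
        try:
            with zipfile.ZipFile(jar) as zf:
                hits = [n for n in zf.namelist() if n.endswith("/YangModuleInfo.class")]
        except zipfile.BadZipFile:
            continue  # skip partial or corrupt downloads
        for entry in hits:
            # e.g. org/opendaylight/yangtools/yang/binding/YangModuleInfo.class
            print(f"{jar.name}: {entry}")

Whatever package the scan reports is the one the lighty sources have to import (or the yangtools/lighty versions need aligning); as the next lines show, tox marks this environment "ignore outcome", so the overall verdict of the run is driven by the test failures rather than by this build error.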
13:13:41 buildlighty: exit 9 (16.89 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh pid=46846 13:13:41 buildlighty: command failed but is marked ignore outcome so handling it as success 13:13:41 buildcontroller: OK (111.93=setup[7.64]+cmd[104.28] seconds) 13:13:41 testsPCE: OK (338.09=setup[103.90]+cmd[234.19] seconds) 13:13:41 sims: OK (11.14=setup[7.11]+cmd[4.03] seconds) 13:13:41 build_karaf_tests121: OK (53.71=setup[7.12]+cmd[46.60] seconds) 13:13:41 tests121: FAIL code 1 (456.68=setup[6.79]+cmd[449.89] seconds) 13:13:41 build_karaf_tests221: OK (50.34=setup[7.08]+cmd[43.26] seconds) 13:13:41 tests_tapi: FAIL code 1 (725.03=setup[5.86]+cmd[719.17] seconds) 13:13:41 tests_network: FAIL code 1 (170.53=setup[6.22]+cmd[164.31] seconds) 13:13:41 tests221: FAIL code 1 (1484.80=setup[6.21]+cmd[1478.58] seconds) 13:13:41 build_karaf_tests71: OK (70.04=setup[14.40]+cmd[55.64] seconds) 13:13:41 tests71: OK (410.04=setup[5.92]+cmd[404.13] seconds) 13:13:41 build_karaf_tests_hybrid: OK (66.33=setup[9.15]+cmd[57.18] seconds) 13:13:41 tests_hybrid: FAIL code 1 (1586.03=setup[6.27]+cmd[1579.76] seconds) 13:13:41 buildlighty: OK (23.05=setup[6.16]+cmd[16.89] seconds) 13:13:41 docs: OK (33.02=setup[30.82]+cmd[2.20] seconds) 13:13:41 docs-linkcheck: OK (33.80=setup[30.27]+cmd[3.52] seconds) 13:13:41 checkbashisms: OK (2.93=setup[1.95]+cmd[0.03,0.05,0.90] seconds) 13:13:41 pre-commit: OK (48.92=setup[3.33]+cmd[0.01,0.00,38.85,6.73] seconds) 13:13:41 pylint: FAIL code 1 (27.15=setup[3.60]+cmd[23.55] seconds) 13:13:41 evaluation failed :( (4394.68 seconds) 13:13:41 + tox_status=255 13:13:41 + echo '---> Completed tox runs' 13:13:41 ---> Completed tox runs 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/build_karaf_tests121/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=build_karaf_tests121 13:13:41 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/build_karaf_tests221/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=build_karaf_tests221 13:13:41 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests221 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/build_karaf_tests71/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=build_karaf_tests71 13:13:41 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests71 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/build_karaf_tests_hybrid/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=build_karaf_tests_hybrid 13:13:41 + cp -r .tox/build_karaf_tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests_hybrid 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/buildcontroller/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=buildcontroller 13:13:41 + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildcontroller 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/buildlighty/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=buildlighty 13:13:41 + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildlighty 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/checkbashisms/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=checkbashisms 13:13:41 + cp -r 
.tox/checkbashisms/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/checkbashisms 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/docs-linkcheck/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=docs-linkcheck 13:13:41 + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs-linkcheck 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/docs/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=docs 13:13:41 + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/pre-commit/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=pre-commit 13:13:41 + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pre-commit 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/pylint/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=pylint 13:13:41 + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pylint 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/sims/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=sims 13:13:41 + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/sims 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests121/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests121 13:13:41 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests121 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests221/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests221 13:13:41 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests221 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests71/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests71 13:13:41 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests71 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/testsPCE/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=testsPCE 13:13:41 + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/testsPCE 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests_hybrid/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests_hybrid 13:13:41 + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_hybrid 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests_network/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests_network 13:13:41 + cp -r .tox/tests_network/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_network 13:13:41 + for i in .tox/*/log 13:13:41 ++ echo .tox/tests_tapi/log 13:13:41 ++ awk -F/ '{print $2}' 13:13:41 + tox_env=tests_tapi 13:13:41 + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_tapi 13:13:41 + DOC_DIR=docs/_build/html 13:13:41 + [[ -d docs/_build/html ]] 13:13:41 + echo '---> Archiving generated docs' 13:13:41 ---> Archiving generated docs 13:13:41 + mv docs/_build/html /w/workspace/transportpce-tox-verify-transportpce-master/archives/docs 13:13:41 + echo '---> tox-run.sh ends' 13:13:41 ---> tox-run.sh ends 13:13:41 + test 255 -eq 0 13:13:41 + exit 255 13:13:41 ++ '[' 1 = 1 ']' 13:13:41 ++ '[' -x /usr/bin/clear_console ']' 13:13:41 ++ /usr/bin/clear_console -q 13:13:41 Build step 'Execute shell' 
marked build as failure 13:13:41 $ ssh-agent -k 13:13:41 unset SSH_AUTH_SOCK; 13:13:41 unset SSH_AGENT_PID; 13:13:41 echo Agent pid 12556 killed; 13:13:41 [ssh-agent] Stopped. 13:13:41 [PostBuildScript] - [INFO] Executing post build scripts. 13:13:41 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11888435382562282928.sh 13:13:41 ---> sysstat.sh 13:13:42 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins7589348460571432793.sh 13:13:42 ---> package-listing.sh 13:13:42 ++ tr '[:upper:]' '[:lower:]' 13:13:42 ++ facter osfamily 13:13:42 + OS_FAMILY=debian 13:13:42 + workspace=/w/workspace/transportpce-tox-verify-transportpce-master 13:13:42 + START_PACKAGES=/tmp/packages_start.txt 13:13:42 + END_PACKAGES=/tmp/packages_end.txt 13:13:42 + DIFF_PACKAGES=/tmp/packages_diff.txt 13:13:42 + PACKAGES=/tmp/packages_start.txt 13:13:42 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' 13:13:42 + PACKAGES=/tmp/packages_end.txt 13:13:42 + case "${OS_FAMILY}" in 13:13:42 + grep '^ii' 13:13:42 + dpkg -l 13:13:42 + '[' -f /tmp/packages_start.txt ']' 13:13:42 + '[' -f /tmp/packages_end.txt ']' 13:13:42 + diff /tmp/packages_start.txt /tmp/packages_end.txt 13:13:42 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' 13:13:42 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/ 13:13:42 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/ 13:13:42 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins10581280690274049665.sh 13:13:42 ---> capture-instance-metadata.sh 13:13:42 Setup pyenv: 13:13:42 system 13:13:42 3.8.13 13:13:42 3.9.13 13:13:42 3.10.13 13:13:42 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 13:13:43 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-L3Bd from file:/tmp/.os_lf_venv 13:13:44 lf-activate-venv(): INFO: Installing: lftools 13:13:56 lf-activate-venv(): INFO: Adding /tmp/venv-L3Bd/bin to PATH 13:13:56 INFO: Running in OpenStack, capturing instance metadata 13:13:56 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins14823219369197572458.sh 13:13:56 provisioning config files... 13:13:56 Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #2087 13:13:56 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config5832300866223612772tmp 13:13:56 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 13:13:56 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 13:13:56 provisioning config files... 13:13:56 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 13:13:56 [EnvInject] - Injecting environment variables from a build step. 13:13:56 [EnvInject] - Injecting as environment variables the properties content 13:13:56 SERVER_ID=logs 13:13:56 13:13:56 [EnvInject] - Variables injected successfully. 13:13:56 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins3630728606982198269.sh 13:13:56 ---> create-netrc.sh 13:13:56 WARN: Log server credential not found. 
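Stepping back to the two functional failures flagged earlier, test_09 and test_10 in the hybrid suite failed only on the status code: the RESTCONF PUT on oms-attributes returned 204 No Content instead of the expected 201 Created, which per RFC 8040 is what a server returns when the target resource already exists and is replaced rather than created. A hedged sketch of a more tolerant check, illustrative only and not a patch to the actual suite:

    # Hedged sketch: accept both RESTCONF outcomes of a PUT on oms-attributes,
    # 201 Created (resource created) and 204 No Content (existing resource
    # replaced). Illustrative only; not the project's actual assertion.
    import requests

    def assert_oms_put_succeeded(response: requests.Response) -> None:
        accepted = {requests.codes.created, requests.codes.no_content}  # 201, 204
        assert response.status_code in accepted, (
            f"unexpected status {response.status_code} for oms-attributes PUT")

Whether the right fix is to accept 204 or to ensure the span data is absent before the PUT is a choice for the test authors; the log only shows that the resource was already present or that the controller's create/replace semantics changed.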
13:13:57 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11615536460021246885.sh 13:13:57 ---> python-tools-install.sh 13:13:57 Setup pyenv: 13:13:57 system 13:13:57 3.8.13 13:13:57 3.9.13 13:13:57 3.10.13 13:13:57 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 13:13:57 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-L3Bd from file:/tmp/.os_lf_venv 13:13:58 lf-activate-venv(): INFO: Installing: lftools 13:14:07 lf-activate-venv(): INFO: Adding /tmp/venv-L3Bd/bin to PATH 13:14:07 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins13170585731091401422.sh 13:14:07 ---> sudo-logs.sh 13:14:07 Archiving 'sudo' log.. 13:14:07 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins10763322531439224449.sh 13:14:07 ---> job-cost.sh 13:14:07 Setup pyenv: 13:14:07 system 13:14:07 3.8.13 13:14:07 3.9.13 13:14:07 3.10.13 13:14:07 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 13:14:07 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-L3Bd from file:/tmp/.os_lf_venv 13:14:09 lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 13:14:13 lf-activate-venv(): INFO: Adding /tmp/venv-L3Bd/bin to PATH 13:14:13 INFO: No Stack... 13:14:13 INFO: Retrieving Pricing Info for: v3-standard-4 13:14:14 INFO: Archiving Costs 13:14:14 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins2806705773580425914.sh 13:14:14 ---> logs-deploy.sh 13:14:14 Setup pyenv: 13:14:14 system 13:14:14 3.8.13 13:14:14 3.9.13 13:14:14 3.10.13 13:14:14 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 13:14:14 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-L3Bd from file:/tmp/.os_lf_venv 13:14:15 lf-activate-venv(): INFO: Installing: lftools 13:14:24 lf-activate-venv(): INFO: Adding /tmp/venv-L3Bd/bin to PATH 13:14:24 WARNING: Nexus logging server not set 13:14:24 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/2087/ 13:14:24 INFO: archiving logs to S3 13:14:26 ---> uname -a: 13:14:26 Linux prd-ubuntu2004-docker-4c-16g-43441 5.4.0-190-generic #210-Ubuntu SMP Fri Jul 5 17:03:38 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 13:14:26 13:14:26 13:14:26 ---> lscpu: 13:14:26 Architecture: x86_64 13:14:26 CPU op-mode(s): 32-bit, 64-bit 13:14:26 Byte Order: Little Endian 13:14:26 Address sizes: 40 bits physical, 48 bits virtual 13:14:26 CPU(s): 4 13:14:26 On-line CPU(s) list: 0-3 13:14:26 Thread(s) per core: 1 13:14:26 Core(s) per socket: 1 13:14:26 Socket(s): 4 13:14:26 NUMA node(s): 1 13:14:26 Vendor ID: AuthenticAMD 13:14:26 CPU family: 23 13:14:26 Model: 49 13:14:26 Model name: AMD EPYC-Rome Processor 13:14:26 Stepping: 0 13:14:26 CPU MHz: 2799.998 13:14:26 BogoMIPS: 5599.99 13:14:26 Virtualization: AMD-V 13:14:26 Hypervisor vendor: KVM 13:14:26 Virtualization type: full 13:14:26 L1d cache: 128 KiB 13:14:26 L1i cache: 128 KiB 13:14:26 L2 cache: 2 MiB 13:14:26 L3 cache: 64 MiB 13:14:26 NUMA node0 CPU(s): 0-3 13:14:26 Vulnerability Gather data sampling: Not affected 13:14:26 Vulnerability Itlb multihit: Not affected 13:14:26 Vulnerability L1tf: Not affected 13:14:26 Vulnerability Mds: Not affected 13:14:26 Vulnerability Meltdown: Not affected 13:14:26 Vulnerability Mmio stale data: Not affected 13:14:26 Vulnerability Retbleed: Vulnerable 13:14:26 Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp 13:14:26 Vulnerability Spectre 
v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization 13:14:26 Vulnerability Spectre v2: Mitigation; Retpolines; IBPB conditional; IBRS_FW; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected 13:14:26 Vulnerability Srbds: Not affected 13:14:26 Vulnerability Tsx async abort: Not affected 13:14:26 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 13:14:26 13:14:26 13:14:26 ---> nproc: 13:14:26 4 13:14:26 13:14:26 13:14:26 ---> df -h: 13:14:26 Filesystem Size Used Avail Use% Mounted on 13:14:26 udev 7.8G 0 7.8G 0% /dev 13:14:26 tmpfs 1.6G 1.1M 1.6G 1% /run 13:14:26 /dev/vda1 78G 17G 62G 21% / 13:14:26 tmpfs 7.9G 0 7.9G 0% /dev/shm 13:14:26 tmpfs 5.0M 0 5.0M 0% /run/lock 13:14:26 tmpfs 7.9G 0 7.9G 0% /sys/fs/cgroup 13:14:26 /dev/loop1 44M 44M 0 100% /snap/snapd/15177 13:14:26 /dev/loop0 62M 62M 0 100% /snap/core20/1405 13:14:26 /dev/loop2 68M 68M 0 100% /snap/lxd/22753 13:14:26 /dev/vda15 105M 6.1M 99M 6% /boot/efi 13:14:26 tmpfs 1.6G 0 1.6G 0% /run/user/1001 13:14:26 /dev/loop3 92M 92M 0 100% /snap/lxd/29619 13:14:26 13:14:26 13:14:26 ---> free -m: 13:14:26 total used free shared buff/cache available 13:14:26 Mem: 15997 1858 4470 1 9667 13799 13:14:26 Swap: 1023 0 1023 13:14:26 13:14:26 13:14:26 ---> ip addr: 13:14:26 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 13:14:26 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 13:14:26 inet 127.0.0.1/8 scope host lo 13:14:26 valid_lft forever preferred_lft forever 13:14:26 inet6 ::1/128 scope host 13:14:26 valid_lft forever preferred_lft forever 13:14:26 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000 13:14:26 link/ether fa:16:3e:71:3e:32 brd ff:ff:ff:ff:ff:ff 13:14:26 inet 10.30.170.232/23 brd 10.30.171.255 scope global dynamic ens3 13:14:26 valid_lft 81831sec preferred_lft 81831sec 13:14:26 inet6 fe80::f816:3eff:fe71:3e32/64 scope link 13:14:26 valid_lft forever preferred_lft forever 13:14:26 3: docker0: mtu 1458 qdisc noqueue state DOWN group default 13:14:26 link/ether 02:42:38:3e:08:a6 brd ff:ff:ff:ff:ff:ff 13:14:26 inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0 13:14:26 valid_lft forever preferred_lft forever 13:14:26 13:14:26 13:14:26 ---> sar -b -r -n DEV: 13:14:26 Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-43441) 10/17/24 _x86_64_ (4 CPU) 13:14:26 13:14:26 11:58:19 LINUX RESTART (4 CPU) 13:14:26 13:14:26 11:59:01 tps rtps wtps dtps bread/s bwrtn/s bdscd/s 13:14:26 12:00:01 254.94 80.70 174.24 0.00 2473.72 7773.52 0.00 13:14:26 12:01:01 190.40 31.89 158.51 0.00 2103.43 24945.55 0.00 13:14:26 12:02:01 132.04 11.01 121.03 0.00 700.42 44758.67 0.00 13:14:26 12:03:01 115.23 0.45 114.78 0.00 35.19 78034.33 0.00 13:14:26 12:04:01 181.17 13.08 168.09 0.00 2711.36 125035.52 0.00 13:14:26 12:05:01 147.94 2.68 145.26 0.00 294.08 29812.23 0.00 13:14:26 12:06:01 71.48 2.82 68.67 0.00 208.83 1311.74 0.00 13:14:26 12:07:01 126.68 0.17 126.51 0.00 22.26 2113.91 0.00 13:14:26 12:08:01 117.88 0.37 
117.51 0.00 73.31 1980.01 0.00 13:14:26 12:09:01 110.21 0.13 110.08 0.00 10.53 9913.01 0.00 13:14:26 12:10:01 3.63 0.00 3.63 0.00 0.00 131.04 0.00 13:14:26 12:11:01 3.36 0.00 3.36 0.00 0.00 59.92 0.00 13:14:26 12:12:01 2.25 0.00 2.25 0.00 0.00 40.79 0.00 13:14:26 12:13:01 2.85 0.02 2.83 0.00 0.13 46.78 0.00 13:14:26 12:14:01 3.18 0.80 2.38 0.00 12.80 31.06 0.00 13:14:26 12:15:01 2.22 0.00 2.22 0.00 0.00 29.06 0.00 13:14:26 12:16:01 101.45 0.08 101.37 0.00 2.13 5149.28 0.00 13:14:26 12:17:01 76.29 0.03 76.25 0.00 0.93 6180.57 0.00 13:14:26 12:18:01 2.05 0.02 2.03 0.00 0.13 36.00 0.00 13:14:26 12:19:01 26.36 0.00 26.36 0.00 0.00 412.66 0.00 13:14:26 12:20:01 60.31 0.00 60.31 0.00 0.00 845.59 0.00 13:14:26 12:21:01 15.88 0.00 15.88 0.00 0.00 249.73 0.00 13:14:26 12:22:01 54.69 0.00 54.69 0.00 0.00 790.00 0.00 13:14:26 12:23:01 120.62 0.05 120.57 0.00 0.40 6430.92 0.00 13:14:26 12:24:01 25.71 0.00 25.71 0.00 0.00 461.26 0.00 13:14:26 12:25:01 82.12 0.00 82.12 0.00 0.00 1201.53 0.00 13:14:26 12:26:01 50.81 0.00 50.81 0.00 0.00 741.34 0.00 13:14:26 12:27:01 2.33 0.00 2.33 0.00 0.00 47.59 0.00 13:14:26 12:28:01 1.38 0.00 1.38 0.00 0.00 15.33 0.00 13:14:26 12:29:01 1.38 0.00 1.38 0.00 0.00 21.46 0.00 13:14:26 12:30:01 1.33 0.00 1.33 0.00 0.00 15.33 0.00 13:14:26 12:31:01 1.88 0.00 1.88 0.00 0.00 26.00 0.00 13:14:26 12:32:01 1.15 0.00 1.15 0.00 0.00 17.86 0.00 13:14:26 12:33:01 1.63 0.00 1.63 0.00 0.00 19.86 0.00 13:14:26 12:34:01 1.38 0.00 1.38 0.00 0.00 20.13 0.00 13:14:26 12:35:01 1.47 0.00 1.47 0.00 0.00 17.46 0.00 13:14:26 12:36:01 1.43 0.00 1.43 0.00 0.00 19.86 0.00 13:14:26 12:37:01 1.55 0.00 1.55 0.00 0.00 22.00 0.00 13:14:26 12:38:01 1.23 0.00 1.23 0.00 0.00 14.93 0.00 13:14:26 12:39:01 1.88 0.00 1.88 0.00 0.00 26.93 0.00 13:14:26 12:40:01 1.07 0.00 1.07 0.00 0.00 12.53 0.00 13:14:26 12:41:01 1.93 0.00 1.93 0.00 0.00 26.80 0.00 13:14:26 12:42:01 1.10 0.00 1.10 0.00 0.00 16.13 0.00 13:14:26 12:43:01 1.35 0.00 1.35 0.00 0.00 17.46 0.00 13:14:26 12:44:01 1.42 0.00 1.42 0.00 0.00 20.53 0.00 13:14:26 12:45:01 1.70 0.00 1.70 0.00 0.00 21.20 0.00 13:14:26 12:46:01 1.28 0.00 1.28 0.00 0.00 18.39 0.00 13:14:26 12:47:01 8.93 0.00 8.93 0.00 0.00 575.24 0.00 13:14:26 12:48:01 50.02 0.00 50.02 0.00 0.00 8758.41 0.00 13:14:26 12:49:01 2.97 0.00 2.97 0.00 0.00 64.39 0.00 13:14:26 12:50:01 1.45 0.00 1.45 0.00 0.00 19.06 0.00 13:14:26 12:51:01 3.15 0.00 3.15 0.00 0.00 159.28 0.00 13:14:26 12:52:01 1.50 0.00 1.50 0.00 0.00 21.60 0.00 13:14:26 12:53:01 1.55 0.00 1.55 0.00 0.00 21.46 0.00 13:14:26 12:54:01 1.97 0.00 1.97 0.00 0.00 28.13 0.00 13:14:26 12:55:01 2.05 0.00 2.05 0.00 0.00 25.60 0.00 13:14:26 12:56:01 1.48 0.00 1.48 0.00 0.00 19.33 0.00 13:14:26 12:57:01 1.90 0.00 1.90 0.00 0.00 25.46 0.00 13:14:26 12:58:01 1.57 0.00 1.57 0.00 0.00 22.26 0.00 13:14:26 12:59:01 1.87 0.00 1.87 0.00 0.00 26.40 0.00 13:14:26 13:00:01 1.80 0.00 1.80 0.00 0.00 24.26 0.00 13:14:26 13:01:01 2.63 0.05 2.58 0.00 4.27 63.32 0.00 13:14:26 13:02:01 2.25 0.00 2.25 0.00 0.00 31.06 0.00 13:14:26 13:03:01 1.98 0.00 1.98 0.00 0.00 25.33 0.00 13:14:26 13:04:01 2.03 0.00 2.03 0.00 0.00 25.73 0.00 13:14:26 13:05:01 2.73 0.00 2.73 0.00 0.00 35.19 0.00 13:14:26 13:06:01 1.72 0.00 1.72 0.00 0.00 21.33 0.00 13:14:26 13:07:01 2.10 0.00 2.10 0.00 0.00 26.93 0.00 13:14:26 13:08:01 1.90 0.00 1.90 0.00 0.00 24.93 0.00 13:14:26 13:09:01 2.05 0.00 2.05 0.00 0.00 25.73 0.00 13:14:26 13:10:01 1.72 0.00 1.72 0.00 0.00 21.60 0.00 13:14:26 13:11:01 2.37 0.00 2.37 0.00 0.00 32.13 0.00 13:14:26 13:12:01 1.48 0.00 1.48 0.00 0.00 19.06 0.00 13:14:26 
13:13:01 1.95 0.00 1.95 0.00 0.00 23.60 0.00 13:14:26 13:14:01 33.46 5.97 27.50 0.00 276.75 3198.93 0.00 13:14:26 Average: 30.06 2.00 28.06 0.00 119.11 4831.97 0.00 13:14:26 13:14:26 11:59:01 kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 13:14:26 12:00:01 13399552 15452244 529072 3.23 69768 2179444 1225040 7.03 786676 1922312 167068 13:14:26 12:01:01 10867908 14490392 1468428 8.96 129000 3575148 2247936 12.90 1888064 3210952 1053172 13:14:26 12:02:01 9589776 13935472 2022500 12.35 152076 4232900 2540476 14.58 2573708 3759660 427420 13:14:26 12:03:01 6514124 13149076 2807752 17.14 181100 6399932 3628948 20.82 4055424 5255616 359088 13:14:26 12:04:01 4922764 13811500 2134064 13.03 215252 8517908 2994648 17.18 3983972 6802040 44968 13:14:26 12:05:01 2374356 11670996 4273172 26.09 226004 8902292 5761360 33.05 6463516 6851392 3852 13:14:26 12:06:01 684828 9166492 6776064 41.36 222864 8104628 7741992 44.42 8818296 6186868 1412 13:14:26 12:07:01 162368 8652840 7289188 44.50 229384 8106720 8564284 49.14 9360028 6167336 868 13:14:26 12:08:01 6227168 14748032 1197812 7.31 234076 8130028 2043172 11.72 3297200 6186388 42016 13:14:26 12:09:01 987020 9706928 6235744 38.07 241732 8315252 7475200 42.89 8375424 6322856 2088 13:14:26 12:10:01 704468 9424848 6517524 39.79 241784 8315736 7654680 43.92 8656044 6322828 116 13:14:26 12:11:01 680436 9401240 6541064 39.93 241824 8316076 7654680 43.92 8680472 6323060 148 13:14:26 12:12:01 657668 9378776 6563436 40.07 241852 8316344 7670676 44.01 8703496 6323328 160 13:14:26 12:13:01 591800 9313084 6629092 40.47 241920 8316448 7702764 44.19 8767608 6323164 88 13:14:26 12:14:01 587416 9309228 6633060 40.49 242024 8316776 7721128 44.30 8769260 6323168 72 13:14:26 12:15:01 587644 9309492 6632856 40.49 242056 8316780 7721128 44.30 8771472 6323168 100 13:14:26 12:16:01 4760376 13730852 2214432 13.52 250376 8543480 2994784 17.18 4444332 6482536 143500 13:14:26 12:17:01 4733812 13706768 2238424 13.66 252416 8543840 3040856 17.45 4492844 6460940 328 13:14:26 12:18:01 4713892 13687080 2258028 13.78 252456 8544020 3040856 17.45 4513288 6461072 128 13:14:26 12:19:01 4677808 13651528 2293472 14.00 252748 8544144 3458116 19.84 4560832 6449600 664 13:14:26 12:20:01 4115204 13090632 2853896 17.42 254176 8544400 3670600 21.06 5119552 6449712 208 13:14:26 12:21:01 6022908 14998496 947368 5.78 254204 8544424 1782144 10.22 3222040 6449432 244 13:14:26 12:22:01 4829196 13806772 2138256 13.05 255852 8544716 2913352 16.71 4409996 6449716 84 13:14:26 12:23:01 2237544 11354144 4588924 28.01 261652 8665376 6234856 35.77 6862604 6567644 2112 13:14:26 12:24:01 3233884 12351128 3592756 21.93 261908 8665656 4942832 28.36 5874100 6562676 672 13:14:26 12:25:01 3291636 12410592 3533624 21.57 262920 8666264 5053248 28.99 5826704 6554520 1064 13:14:26 12:26:01 1453348 10573520 5369944 32.78 263920 8666436 6591744 37.82 7659628 6554692 296 13:14:26 12:27:01 4089772 13209996 2734308 16.69 263928 8666472 3766964 21.61 5031288 6553640 144 13:14:26 12:28:01 4089876 13210104 2734372 16.69 263928 8666476 3766964 21.61 5031376 6553644 196 13:14:26 12:29:01 4087144 13207380 2737100 16.71 263944 8666476 3767380 21.61 5033520 6553644 112 13:14:26 12:30:01 4086972 13207220 2737232 16.71 263948 8666480 3767380 21.61 5033476 6553648 36 13:14:26 12:31:01 4087240 13207508 2736916 16.71 263960 8666484 3764608 21.60 5033340 6553652 36 13:14:26 12:32:01 4087548 13207832 2736504 16.71 263968 8666488 3761152 21.58 5033784 6553656 248 13:14:26 12:33:01 4087344 13207632 
2736704 16.71 263972 8666488 3761152 21.58 5033612 6553656 20 13:14:26 12:34:01 4087888 13208184 2736228 16.70 263972 8666492 3768832 21.62 5033676 6553660 228 13:14:26 12:35:01 4087500 13207800 2736720 16.71 263976 8666492 3768832 21.62 5033884 6553660 24 13:14:26 12:36:01 4087896 13208112 2736452 16.70 263988 8666492 3761660 21.58 5034172 6553656 20 13:14:26 12:37:01 4087392 13207624 2737004 16.71 263996 8666496 3768332 21.62 5034452 6553660 272 13:14:26 12:38:01 4086896 13207132 2737456 16.71 263996 8666500 3768332 21.62 5034384 6553660 184 13:14:26 12:39:01 4086384 13206644 2737916 16.71 264016 8666500 3760736 21.58 5034476 6553668 236 13:14:26 12:40:01 4086440 13206708 2737848 16.71 264016 8666508 3760736 21.58 5034320 6553672 28 13:14:26 12:41:01 4086620 13206892 2737652 16.71 264020 8666512 3758720 21.56 5034452 6553676 32 13:14:26 12:42:01 4087108 13207396 2737120 16.71 264028 8666512 3755520 21.55 5034664 6553676 276 13:14:26 12:43:01 4086384 13206676 2737768 16.71 264028 8666516 3771512 21.64 5034664 6553680 272 13:14:26 12:44:01 4086628 13206924 2737536 16.71 264036 8666516 3771984 21.64 5034852 6553680 292 13:14:26 12:45:01 4086556 13206868 2737596 16.71 264044 8666520 3771984 21.64 5034816 6553684 24 13:14:26 12:46:01 4086604 13206916 2737568 16.71 264048 8666520 3757120 21.56 5034908 6553684 212 13:14:26 12:47:01 3825308 13027064 2917552 17.81 266708 8738456 4002912 22.97 5222368 6619540 66904 13:14:26 12:48:01 520000 9885796 6056880 36.97 270388 8892044 7358276 42.22 8400904 6729344 1028 13:14:26 12:49:01 519788 9885516 6057096 36.98 270396 8892052 7374288 42.31 8400224 6729340 40 13:14:26 12:50:01 521604 9887340 6055272 36.96 270404 8892052 7374288 42.31 8398060 6729324 80 13:14:26 12:51:01 479740 9846324 6096268 37.21 270424 8892740 7374288 42.31 8438464 6729596 204 13:14:26 12:52:01 478244 9844832 6097760 37.22 270428 8892740 7374288 42.31 8441592 6729596 56 13:14:26 12:53:01 477480 9844072 6098472 37.23 270432 8892744 7390348 42.40 8441672 6729600 100 13:14:26 12:54:01 322436 9689096 6253392 38.17 270448 8892796 7440076 42.69 8595740 6729656 312 13:14:26 12:55:01 322232 9688908 6253580 38.18 270452 8892808 7440076 42.69 8595400 6729660 100 13:14:26 12:56:01 321728 9688408 6254052 38.18 270460 8892808 7440076 42.69 8595936 6729664 84 13:14:26 12:57:01 188804 9555544 6386920 38.99 270464 8892856 7489996 42.97 8727652 6729712 192 13:14:26 12:58:01 189212 9555960 6386508 38.99 270468 8892856 7489996 42.97 8727856 6729712 96 13:14:26 12:59:01 188528 9555356 6387064 38.99 270484 8892924 7489996 42.97 8728540 6729780 300 13:14:26 13:00:01 1253436 10548800 5393524 32.92 270052 8822804 6430568 36.89 7724328 6671088 424 13:14:26 13:01:01 1091276 10387140 5555008 33.91 270084 8823268 6531592 37.47 7886312 6671272 52 13:14:26 13:02:01 1090928 10386812 5555324 33.91 270088 8823276 6531592 37.47 7885640 6671284 44 13:14:26 13:03:01 1089024 10384912 5557208 33.92 270092 8823276 6531592 37.47 7887744 6671284 104 13:14:26 13:04:01 1088276 10384172 5557912 33.93 270100 8823280 6531592 37.47 7887428 6671288 264 13:14:26 13:05:01 1088220 10384136 5557952 33.93 270108 8823288 6531592 37.47 7887752 6671296 60 13:14:26 13:06:01 1088220 10384144 5557940 33.93 270116 8823292 6547720 37.57 7887784 6671300 48 13:14:26 13:07:01 1088244 10384188 5557892 33.93 270132 8823296 6547720 37.57 7887836 6671304 88 13:14:26 13:08:01 1086976 10383072 5558992 33.93 270140 8823436 6547720 37.57 7889416 6671444 56 13:14:26 13:09:01 1085976 10382076 5559960 33.94 270144 8823436 6547720 37.57 7890944 6671444 272 
13:14:26 13:10:01 1084440 10380552 5561484 33.95 270152 8823440 6547720 37.57 7892252 6671448 280 13:14:26 13:11:01 1082156 10378284 5563732 33.96 270180 8823448 6547720 37.57 7894204 6671456 64 13:14:26 13:12:01 1080660 10376824 5565180 33.97 270188 8823456 6547720 37.57 7895592 6671464 36 13:14:26 13:13:01 1078880 10375064 5566932 33.98 270204 8823456 6547720 37.57 7896948 6671464 52 13:14:26 13:14:01 4644184 14163836 1779712 10.86 275340 9028116 2610396 14.98 4154540 6850516 138308 13:14:26 Average: 2849748 11707626 4236900 25.86 253656 8428927 5286547 30.33 6423704 6409917 32860 13:14:26 13:14:26 11:59:01 IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 13:14:26 12:00:01 ens3 109.45 81.40 883.50 19.01 0.00 0.00 0.00 0.00 13:14:26 12:00:01 lo 0.80 0.80 0.08 0.08 0.00 0.00 0.00 0.00 13:14:26 12:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:01:01 ens3 438.14 356.90 6459.69 39.79 0.00 0.00 0.00 0.00 13:14:26 12:01:01 lo 6.26 6.26 0.63 0.63 0.00 0.00 0.00 0.00 13:14:26 12:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:02:01 ens3 296.02 256.86 4751.81 26.51 0.00 0.00 0.00 0.00 13:14:26 12:02:01 lo 0.53 0.53 0.05 0.05 0.00 0.00 0.00 0.00 13:14:26 12:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:03:01 ens3 107.68 74.90 2222.59 9.19 0.00 0.00 0.00 0.00 13:14:26 12:03:01 lo 1.20 1.20 0.11 0.11 0.00 0.00 0.00 0.00 13:14:26 12:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:04:01 ens3 172.38 94.30 3137.27 8.43 0.00 0.00 0.00 0.00 13:14:26 12:04:01 lo 1.73 1.73 0.17 0.17 0.00 0.00 0.00 0.00 13:14:26 12:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:05:01 ens3 1.73 1.48 0.67 0.43 0.00 0.00 0.00 0.00 13:14:26 12:05:01 lo 5.07 5.07 1.32 1.32 0.00 0.00 0.00 0.00 13:14:26 12:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:06:01 ens3 1.63 1.43 0.30 0.27 0.00 0.00 0.00 0.00 13:14:26 12:06:01 lo 48.53 48.53 48.56 48.56 0.00 0.00 0.00 0.00 13:14:26 12:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:07:01 ens3 1.17 1.03 0.20 0.18 0.00 0.00 0.00 0.00 13:14:26 12:07:01 lo 27.86 27.86 14.64 14.64 0.00 0.00 0.00 0.00 13:14:26 12:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:08:01 ens3 2.17 2.35 0.87 0.76 0.00 0.00 0.00 0.00 13:14:26 12:08:01 lo 20.26 20.26 8.48 8.48 0.00 0.00 0.00 0.00 13:14:26 12:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:09:01 ens3 1.08 1.15 0.25 0.20 0.00 0.00 0.00 0.00 13:14:26 12:09:01 lo 18.35 18.35 19.36 19.36 0.00 0.00 0.00 0.00 13:14:26 12:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:10:01 ens3 0.57 0.65 0.11 0.11 0.00 0.00 0.00 0.00 13:14:26 12:10:01 lo 22.85 22.85 13.31 13.31 0.00 0.00 0.00 0.00 13:14:26 12:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:11:01 ens3 0.46 0.36 0.06 0.05 0.00 0.00 0.00 0.00 13:14:26 12:11:01 lo 19.73 19.73 9.45 9.45 0.00 0.00 0.00 0.00 13:14:26 12:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:12:01 ens3 0.25 0.27 0.04 0.04 0.00 0.00 0.00 0.00 13:14:26 12:12:01 lo 21.33 21.33 7.15 7.15 0.00 0.00 0.00 0.00 13:14:26 12:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:13:01 ens3 1.05 0.55 0.14 0.09 0.00 0.00 0.00 0.00 13:14:26 12:13:01 lo 3.52 3.52 1.70 1.70 0.00 0.00 0.00 0.00 13:14:26 12:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:14:01 ens3 0.70 0.47 0.40 0.27 0.00 0.00 0.00 0.00 13:14:26 12:14:01 lo 0.20 0.20 0.02 0.02 0.00 0.00 0.00 0.00 13:14:26 12:14:01 docker0 
0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:15:01 ens3 0.27 0.08 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:15:01 lo 0.50 0.50 0.05 0.05 0.00 0.00 0.00 0.00 13:14:26 12:15:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:16:01 ens3 15.45 13.33 3.83 9.38 0.00 0.00 0.00 0.00 13:14:26 12:16:01 lo 9.80 9.80 15.81 15.81 0.00 0.00 0.00 0.00 13:14:26 12:16:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:17:01 ens3 1.32 1.02 0.47 0.38 0.00 0.00 0.00 0.00 13:14:26 12:17:01 lo 14.61 14.61 7.40 7.40 0.00 0.00 0.00 0.00 13:14:26 12:17:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:18:01 ens3 1.25 1.00 0.24 0.21 0.00 0.00 0.00 0.00 13:14:26 12:18:01 lo 27.73 27.73 9.00 9.00 0.00 0.00 0.00 0.00 13:14:26 12:18:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:19:01 ens3 1.28 1.12 0.33 0.25 0.00 0.00 0.00 0.00 13:14:26 12:19:01 lo 16.41 16.41 4.84 4.84 0.00 0.00 0.00 0.00 13:14:26 12:19:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:20:01 ens3 0.87 0.75 0.17 0.16 0.00 0.00 0.00 0.00 13:14:26 12:20:01 lo 38.33 38.33 18.74 18.74 0.00 0.00 0.00 0.00 13:14:26 12:20:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:21:01 ens3 1.27 1.10 0.24 0.22 0.00 0.00 0.00 0.00 13:14:26 12:21:01 lo 22.92 22.92 6.78 6.78 0.00 0.00 0.00 0.00 13:14:26 12:21:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:22:01 ens3 0.83 0.72 0.16 0.14 0.00 0.00 0.00 0.00 13:14:26 12:22:01 lo 26.28 26.28 11.54 11.54 0.00 0.00 0.00 0.00 13:14:26 12:22:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:23:01 ens3 3.22 3.73 1.53 1.36 0.00 0.00 0.00 0.00 13:14:26 12:23:01 lo 9.96 9.96 2.87 2.87 0.00 0.00 0.00 0.00 13:14:26 12:23:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:24:01 ens3 1.77 1.52 0.44 0.35 0.00 0.00 0.00 0.00 13:14:26 12:24:01 lo 10.75 10.75 7.78 7.78 0.00 0.00 0.00 0.00 13:14:26 12:24:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:25:01 ens3 7.08 5.45 1.41 3.47 0.00 0.00 0.00 0.00 13:14:26 12:25:01 lo 14.18 14.18 4.77 4.77 0.00 0.00 0.00 0.00 13:14:26 12:25:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:26:01 ens3 2.03 1.20 0.73 0.55 0.00 0.00 0.00 0.00 13:14:26 12:26:01 lo 15.86 15.86 8.05 8.05 0.00 0.00 0.00 0.00 13:14:26 12:26:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:27:01 ens3 2.10 1.30 0.77 0.58 0.00 0.00 0.00 0.00 13:14:26 12:27:01 lo 15.35 15.35 6.14 6.14 0.00 0.00 0.00 0.00 13:14:26 12:27:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:28:01 ens3 1.10 0.35 0.33 0.21 0.00 0.00 0.00 0.00 13:14:26 12:28:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:28:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:29:01 ens3 1.25 0.23 0.25 0.08 0.00 0.00 0.00 0.00 13:14:26 12:29:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:29:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:30:01 ens3 0.97 0.40 0.53 0.39 0.00 0.00 0.00 0.00 13:14:26 12:30:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:30:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:31:01 ens3 0.25 0.12 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:31:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:31:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:32:01 ens3 1.18 0.40 0.34 0.21 0.00 0.00 0.00 0.00 13:14:26 12:32:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:32:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:33:01 ens3 0.50 0.40 0.27 0.22 0.00 0.00 0.00 0.00 
13:14:26 12:33:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:33:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:34:01 ens3 0.35 0.15 0.09 0.07 0.00 0.00 0.00 0.00 13:14:26 12:34:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:34:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:35:01 ens3 0.37 0.17 0.08 0.01 0.00 0.00 0.00 0.00 13:14:26 12:35:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:35:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:36:01 ens3 0.23 0.08 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:36:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:36:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:37:01 ens3 0.93 0.48 0.32 0.23 0.00 0.00 0.00 0.00 13:14:26 12:37:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:37:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:38:01 ens3 0.17 0.00 0.01 0.00 0.00 0.00 0.00 0.00 13:14:26 12:38:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:38:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:39:01 ens3 0.38 0.30 0.10 0.08 0.00 0.00 0.00 0.00 13:14:26 12:39:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:39:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:40:01 ens3 0.23 0.10 0.07 0.01 0.00 0.00 0.00 0.00 13:14:26 12:40:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:41:01 ens3 0.30 0.13 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:41:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:41:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:42:01 ens3 0.17 0.08 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:42:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:42:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:43:01 ens3 0.17 0.10 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:43:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:43:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:44:01 ens3 0.40 0.25 0.15 0.08 0.00 0.00 0.00 0.00 13:14:26 12:44:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:44:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:45:01 ens3 0.32 0.10 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:45:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:45:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:46:01 ens3 0.15 0.07 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:46:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:46:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:47:01 ens3 2.20 2.37 0.89 1.11 0.00 0.00 0.00 0.00 13:14:26 12:47:01 lo 0.40 0.40 0.03 0.03 0.00 0.00 0.00 0.00 13:14:26 12:47:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:48:01 ens3 0.75 0.28 0.08 0.03 0.00 0.00 0.00 0.00 13:14:26 12:48:01 lo 3.72 3.72 1.68 1.68 0.00 0.00 0.00 0.00 13:14:26 12:48:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:49:01 ens3 1.50 0.73 0.74 0.51 0.00 0.00 0.00 0.00 13:14:26 12:49:01 lo 0.32 0.32 0.02 0.02 0.00 0.00 0.00 0.00 13:14:26 12:49:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:50:01 ens3 0.15 0.00 0.01 0.00 0.00 0.00 0.00 0.00 13:14:26 12:50:01 lo 0.18 0.18 0.02 0.02 0.00 0.00 0.00 0.00 13:14:26 12:50:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:51:01 ens3 0.18 0.13 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:51:01 lo 4.68 4.68 11.43 11.43 0.00 0.00 0.00 0.00 13:14:26 12:51:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 
13:14:26 12:52:01 ens3 0.13 0.00 0.01 0.00 0.00 0.00 0.00 0.00 13:14:26 12:52:01 lo 0.28 0.28 0.03 0.03 0.00 0.00 0.00 0.00 13:14:26 12:52:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:53:01 ens3 0.23 0.10 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:53:01 lo 0.48 0.48 0.04 0.04 0.00 0.00 0.00 0.00 13:14:26 12:53:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:54:01 ens3 0.33 0.17 0.14 0.07 0.00 0.00 0.00 0.00 13:14:26 12:54:01 lo 5.65 5.65 2.42 2.42 0.00 0.00 0.00 0.00 13:14:26 12:54:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:55:01 ens3 0.18 0.10 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 12:55:01 lo 0.62 0.62 0.05 0.05 0.00 0.00 0.00 0.00 13:14:26 12:55:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:56:01 ens3 0.10 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:56:01 lo 0.50 0.50 0.05 0.05 0.00 0.00 0.00 0.00 13:14:26 12:56:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:57:01 ens3 0.27 0.13 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 12:57:01 lo 5.72 5.72 2.28 2.28 0.00 0.00 0.00 0.00 13:14:26 12:57:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:58:01 ens3 0.15 0.00 0.01 0.00 0.00 0.00 0.00 0.00 13:14:26 12:58:01 lo 0.60 0.60 0.06 0.06 0.00 0.00 0.00 0.00 13:14:26 12:58:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 12:59:01 ens3 0.45 0.20 0.15 0.07 0.00 0.00 0.00 0.00 13:14:26 12:59:01 lo 0.83 0.83 0.08 0.08 0.00 0.00 0.00 0.00 13:14:26 12:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:00:01 ens3 0.53 0.08 0.06 0.01 0.00 0.00 0.00 0.00 13:14:26 13:00:01 lo 14.50 14.50 4.30 4.30 0.00 0.00 0.00 0.00 13:14:26 13:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:01:01 ens3 1.72 1.50 0.54 0.46 0.00 0.00 0.00 0.00 13:14:26 13:01:01 lo 35.14 35.14 14.76 14.76 0.00 0.00 0.00 0.00 13:14:26 13:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:02:01 ens3 1.00 0.05 0.11 0.01 0.00 0.00 0.00 0.00 13:14:26 13:02:01 lo 1.40 1.40 0.63 0.63 0.00 0.00 0.00 0.00 13:14:26 13:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:03:01 ens3 1.00 0.52 0.53 0.41 0.00 0.00 0.00 0.00 13:14:26 13:03:01 lo 0.75 0.75 0.08 0.08 0.00 0.00 0.00 0.00 13:14:26 13:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:04:01 ens3 0.30 0.23 0.14 0.07 0.00 0.00 0.00 0.00 13:14:26 13:04:01 lo 0.40 0.40 0.03 0.03 0.00 0.00 0.00 0.00 13:14:26 13:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:05:01 ens3 0.28 0.12 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 13:05:01 lo 0.87 0.87 0.08 0.08 0.00 0.00 0.00 0.00 13:14:26 13:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:06:01 ens3 0.03 0.02 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:06:01 lo 0.25 0.25 0.02 0.02 0.00 0.00 0.00 0.00 13:14:26 13:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:07:01 ens3 0.30 0.12 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 13:07:01 lo 0.55 0.55 0.06 0.06 0.00 0.00 0.00 0.00 13:14:26 13:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:08:01 ens3 0.23 0.08 0.02 0.01 0.00 0.00 0.00 0.00 13:14:26 13:08:01 lo 0.40 0.40 0.03 0.03 0.00 0.00 0.00 0.00 13:14:26 13:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:09:01 ens3 0.68 0.25 0.13 0.08 0.00 0.00 0.00 0.00 13:14:26 13:09:01 lo 0.40 0.40 0.04 0.04 0.00 0.00 0.00 0.00 13:14:26 13:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:10:01 ens3 0.55 0.27 0.36 0.23 0.00 0.00 0.00 0.00 13:14:26 13:10:01 lo 0.13 0.13 0.01 0.01 0.00 0.00 0.00 0.00 
13:14:26 13:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:11:01 ens3 0.33 0.13 0.03 0.01 0.00 0.00 0.00 0.00 13:14:26 13:11:01 lo 0.47 0.47 0.03 0.03 0.00 0.00 0.00 0.00 13:14:26 13:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:12:01 ens3 0.03 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:12:01 lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:13:01 ens3 0.22 0.10 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 13:13:01 lo 0.20 0.20 0.01 0.01 0.00 0.00 0.00 0.00 13:14:26 13:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:14:01 ens3 160.57 126.23 1959.60 29.64 0.00 0.00 0.00 0.00 13:14:26 13:14:01 lo 0.62 0.62 0.06 0.06 0.00 0.00 0.00 0.00 13:14:26 13:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 Average: ens3 18.10 13.93 259.20 2.09 0.00 0.00 0.00 0.00 13:14:26 Average: lo 6.70 6.70 3.56 3.56 0.00 0.00 0.00 0.00 13:14:26 Average: docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 13:14:26 13:14:26 13:14:26 ---> sar -P ALL: 13:14:26 Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-43441) 10/17/24 _x86_64_ (4 CPU) 13:14:26 13:14:26 11:58:19 LINUX RESTART (4 CPU) 13:14:26 13:14:26 11:59:01 CPU %user %nice %system %iowait %steal %idle 13:14:26 12:00:01 all 16.20 11.23 9.18 3.36 0.09 59.94 13:14:26 12:00:01 0 18.49 10.14 8.64 2.07 0.07 60.59 13:14:26 12:00:01 1 11.75 10.74 9.08 4.48 0.12 63.84 13:14:26 12:00:01 2 30.19 11.10 10.90 4.49 0.08 43.24 13:14:26 12:00:01 3 4.40 12.96 8.09 2.40 0.08 72.08 13:14:26 12:01:01 all 52.72 0.00 4.26 4.64 0.10 38.29 13:14:26 12:01:01 0 49.38 0.00 3.78 3.60 0.08 43.15 13:14:26 12:01:01 1 51.68 0.00 4.61 5.64 0.10 37.96 13:14:26 12:01:01 2 65.27 0.00 4.55 3.10 0.10 26.98 13:14:26 12:01:01 3 44.57 0.00 4.09 6.22 0.10 45.02 13:14:26 12:02:01 all 65.32 0.00 2.31 1.52 0.10 30.76 13:14:26 12:02:01 0 55.86 0.00 2.51 3.07 0.12 38.44 13:14:26 12:02:01 1 72.42 0.00 1.34 0.68 0.10 25.46 13:14:26 12:02:01 2 66.05 0.00 2.48 1.69 0.08 29.70 13:14:26 12:02:01 3 67.02 0.00 2.90 0.62 0.10 29.35 13:14:26 12:03:01 all 64.44 0.00 3.17 5.82 0.11 26.47 13:14:26 12:03:01 0 55.18 0.00 3.05 9.91 0.12 31.75 13:14:26 12:03:01 1 55.58 0.00 2.72 9.22 0.10 32.37 13:14:26 12:03:01 2 59.23 0.00 2.95 2.46 0.08 35.27 13:14:26 12:03:01 3 87.90 0.00 3.95 1.67 0.12 6.37 13:14:26 12:04:01 all 65.15 0.00 4.01 22.94 0.14 7.76 13:14:26 12:04:01 0 72.11 0.00 3.65 15.06 0.17 9.00 13:14:26 12:04:01 1 62.72 0.00 3.44 24.55 0.14 9.14 13:14:26 12:04:01 2 61.28 0.00 4.40 27.43 0.14 6.75 13:14:26 12:04:01 3 64.46 0.00 4.56 24.71 0.12 6.15 13:14:26 12:05:01 all 80.29 0.00 2.74 1.45 0.13 15.40 13:14:26 12:05:01 0 82.29 0.00 2.92 1.00 0.12 13.67 13:14:26 12:05:01 1 79.79 0.00 2.41 1.87 0.13 15.79 13:14:26 12:05:01 2 79.29 0.00 2.93 1.89 0.14 15.75 13:14:26 12:05:01 3 79.76 0.00 2.70 1.03 0.13 16.38 13:14:26 12:06:01 all 47.51 0.00 1.51 0.24 0.10 50.64 13:14:26 12:06:01 0 48.58 0.00 1.44 0.03 0.10 49.85 13:14:26 12:06:01 1 49.16 0.00 1.70 0.10 0.10 48.94 13:14:26 12:06:01 2 46.99 0.00 1.56 0.49 0.10 50.86 13:14:26 12:06:01 3 45.30 0.00 1.34 0.35 0.10 52.91 13:14:26 12:07:01 all 35.49 0.00 1.16 0.56 0.10 62.68 13:14:26 12:07:01 0 37.66 0.00 1.31 0.60 0.10 60.33 13:14:26 12:07:01 1 36.82 0.00 1.02 0.72 0.12 61.32 13:14:26 12:07:01 2 31.48 0.00 1.22 0.85 0.10 66.35 13:14:26 12:07:01 3 36.02 0.00 1.11 0.05 0.10 62.73 13:14:26 12:08:01 all 59.13 0.00 2.08 0.15 0.10 38.53 13:14:26 12:08:01 0 59.81 0.00 2.35 0.05 0.12 37.68 13:14:26 12:08:01 1 59.30 0.00 2.20 0.34 0.08 38.09 
13:14:26 12:08:01 2 59.17 0.00 2.13 0.15 0.12 38.43 13:14:26 12:08:01 3 58.24 0.00 1.64 0.08 0.10 39.93 13:14:26 12:09:01 all 71.20 0.00 2.22 0.28 0.11 26.19 13:14:26 12:09:01 0 71.29 0.00 2.33 0.08 0.10 26.19 13:14:26 12:09:01 1 70.22 0.00 2.45 0.17 0.12 27.04 13:14:26 12:09:01 2 71.83 0.00 2.21 0.20 0.10 25.65 13:14:26 12:09:01 3 71.46 0.00 1.87 0.67 0.13 25.87 13:14:26 12:10:01 all 9.92 0.00 0.44 0.05 0.11 89.47 13:14:26 12:10:01 0 10.05 0.00 0.52 0.00 0.12 89.31 13:14:26 12:10:01 1 10.06 0.00 0.42 0.12 0.10 89.31 13:14:26 12:10:01 2 9.80 0.00 0.40 0.00 0.12 89.68 13:14:26 12:10:01 3 9.78 0.00 0.43 0.10 0.10 89.58 13:14:26 13:14:26 12:10:01 CPU %user %nice %system %iowait %steal %idle 13:14:26 12:11:01 all 4.62 0.00 0.34 0.02 0.10 94.92 13:14:26 12:11:01 0 4.35 0.00 0.36 0.00 0.10 95.19 13:14:26 12:11:01 1 4.61 0.00 0.32 0.03 0.10 94.94 13:14:26 12:11:01 2 4.06 0.00 0.31 0.00 0.10 95.53 13:14:26 12:11:01 3 5.46 0.00 0.39 0.03 0.10 94.01 13:14:26 12:12:01 all 3.38 0.00 0.34 0.01 0.09 96.19 13:14:26 12:12:01 0 2.93 0.00 0.25 0.00 0.08 96.73 13:14:26 12:12:01 1 2.85 0.00 0.35 0.00 0.10 96.70 13:14:26 12:12:01 2 3.65 0.00 0.42 0.00 0.08 95.85 13:14:26 12:12:01 3 4.08 0.00 0.33 0.03 0.08 95.47 13:14:26 12:13:01 all 2.47 0.00 0.27 0.02 0.08 97.16 13:14:26 12:13:01 0 3.70 0.00 0.36 0.00 0.08 95.85 13:14:26 12:13:01 1 1.89 0.00 0.25 0.02 0.07 97.77 13:14:26 12:13:01 2 2.08 0.00 0.25 0.02 0.08 97.57 13:14:26 12:13:01 3 2.19 0.00 0.22 0.05 0.07 97.48 13:14:26 12:14:01 all 0.96 0.00 0.28 0.02 0.09 98.66 13:14:26 12:14:01 0 2.02 0.00 0.40 0.00 0.10 97.48 13:14:26 12:14:01 1 0.45 0.00 0.20 0.02 0.08 99.24 13:14:26 12:14:01 2 0.58 0.00 0.30 0.00 0.08 99.03 13:14:26 12:14:01 3 0.75 0.00 0.22 0.05 0.08 98.89 13:14:26 12:15:01 all 0.65 0.00 0.23 0.01 0.10 99.01 13:14:26 12:15:01 0 0.69 0.00 0.34 0.00 0.10 98.87 13:14:26 12:15:01 1 0.35 0.00 0.20 0.03 0.10 99.31 13:14:26 12:15:01 2 0.67 0.00 0.22 0.02 0.08 99.01 13:14:26 12:15:01 3 0.87 0.00 0.18 0.00 0.12 98.83 13:14:26 12:16:01 all 38.30 0.00 1.52 0.94 0.09 59.16 13:14:26 12:16:01 0 40.87 0.00 1.32 0.92 0.08 56.80 13:14:26 12:16:01 1 43.52 0.00 2.01 1.60 0.07 52.81 13:14:26 12:16:01 2 36.23 0.00 1.52 1.22 0.10 60.94 13:14:26 12:16:01 3 32.58 0.00 1.22 0.00 0.12 66.08 13:14:26 12:17:01 all 31.45 0.00 0.90 0.57 0.09 66.99 13:14:26 12:17:01 0 31.79 0.00 0.84 0.07 0.10 67.21 13:14:26 12:17:01 1 32.61 0.00 0.99 1.02 0.08 65.30 13:14:26 12:17:01 2 31.00 0.00 1.00 0.73 0.08 67.18 13:14:26 12:17:01 3 30.40 0.00 0.76 0.46 0.10 68.27 13:14:26 12:18:01 all 3.07 0.00 0.18 0.01 0.09 96.65 13:14:26 12:18:01 0 3.16 0.00 0.17 0.00 0.08 96.59 13:14:26 12:18:01 1 3.12 0.00 0.12 0.02 0.10 96.65 13:14:26 12:18:01 2 3.06 0.00 0.22 0.03 0.07 96.62 13:14:26 12:18:01 3 2.94 0.00 0.22 0.00 0.10 96.74 13:14:26 12:19:01 all 26.06 0.00 0.91 0.03 0.10 72.90 13:14:26 12:19:01 0 22.88 0.00 0.80 0.00 0.08 76.23 13:14:26 12:19:01 1 27.67 0.00 0.83 0.02 0.10 71.38 13:14:26 12:19:01 2 26.76 0.00 1.22 0.05 0.10 71.87 13:14:26 12:19:01 3 26.95 0.00 0.80 0.03 0.10 72.11 13:14:26 12:20:01 all 16.98 0.00 0.43 0.23 0.09 82.27 13:14:26 12:20:01 0 15.94 0.00 0.38 0.02 0.07 83.59 13:14:26 12:20:01 1 17.46 0.00 0.44 0.00 0.08 82.02 13:14:26 12:20:01 2 17.65 0.00 0.40 0.45 0.10 81.39 13:14:26 12:20:01 3 16.88 0.00 0.50 0.43 0.10 82.08 13:14:26 12:21:01 all 9.45 0.00 0.53 0.03 0.10 89.89 13:14:26 12:21:01 0 10.12 0.00 0.40 0.03 0.10 89.34 13:14:26 12:21:01 1 9.30 0.00 0.55 0.02 0.08 90.05 13:14:26 12:21:01 2 9.90 0.00 0.52 0.07 0.10 89.41 13:14:26 12:21:01 3 8.46 0.00 0.65 0.02 0.10 90.77 13:14:26 
13:14:26 12:21:01 CPU %user %nice %system %iowait %steal %idle
13:14:26 12:22:01 all 24.70 0.00 0.60 0.31 0.09 74.30
13:14:26 12:22:01 0 25.87 0.00 0.69 0.02 0.08 73.34
13:14:26 12:22:01 1 22.18 0.00 0.55 0.05 0.10 77.11
13:14:26 12:22:01 2 23.70 0.00 0.75 1.19 0.08 74.28
13:14:26 12:22:01 3 27.03 0.00 0.40 0.00 0.08 72.49
13:14:26 12:23:01 all 74.07 0.00 2.64 0.21 0.09 22.98
13:14:26 12:23:01 0 69.98 0.00 2.37 0.08 0.08 27.48
13:14:26 12:23:01 1 77.02 0.00 3.04 0.08 0.10 19.76
13:14:26 12:23:01 2 71.83 0.00 2.35 0.64 0.08 25.11
13:14:26 12:23:01 3 77.46 0.00 2.80 0.05 0.10 19.59
13:14:26 12:24:01 all 31.21 0.00 1.08 0.03 0.10 67.57
13:14:26 12:24:01 0 29.96 0.00 1.04 0.08 0.10 68.82
13:14:26 12:24:01 1 31.25 0.00 1.46 0.02 0.10 67.17
13:14:26 12:24:01 2 30.65 0.00 1.00 0.03 0.08 68.23
13:14:26 12:24:01 3 33.00 0.00 0.84 0.00 0.10 66.06
13:14:26 12:25:01 all 40.63 0.00 1.33 0.34 0.09 57.61
13:14:26 12:25:01 0 40.50 0.00 1.32 0.02 0.10 58.06
13:14:26 12:25:01 1 42.41 0.00 1.12 0.03 0.08 56.35
13:14:26 12:25:01 2 40.15 0.00 1.34 1.32 0.08 57.10
13:14:26 12:25:01 3 39.44 0.00 1.54 0.00 0.10 58.92
13:14:26 12:26:01 all 19.62 0.00 0.72 0.25 0.10 79.31
13:14:26 12:26:01 0 18.31 0.00 0.55 0.02 0.12 81.00
13:14:26 12:26:01 1 20.64 0.00 0.98 0.40 0.10 77.88
13:14:26 12:26:01 2 19.54 0.00 0.55 0.58 0.08 79.25
13:14:26 12:26:01 3 20.00 0.00 0.80 0.00 0.08 79.12
13:14:26 12:27:01 all 3.36 0.00 0.42 0.02 0.08 96.13
13:14:26 12:27:01 0 2.82 0.00 0.47 0.00 0.07 96.65
13:14:26 12:27:01 1 3.22 0.00 0.38 0.00 0.08 96.32
13:14:26 12:27:01 2 3.16 0.00 0.32 0.07 0.07 96.39
13:14:26 12:27:01 3 4.23 0.00 0.50 0.00 0.08 95.19
13:14:26 12:28:01 all 0.40 0.00 0.18 0.01 0.08 99.32
13:14:26 12:28:01 0 0.20 0.00 0.10 0.00 0.07 99.63
13:14:26 12:28:01 1 0.48 0.00 0.20 0.00 0.10 99.22
13:14:26 12:28:01 2 0.38 0.00 0.10 0.03 0.07 99.42
13:14:26 12:28:01 3 0.53 0.00 0.33 0.00 0.10 99.03
13:14:26 12:29:01 all 0.75 0.00 0.25 0.01 0.06 98.93
13:14:26 12:29:01 0 0.95 0.00 0.33 0.00 0.07 98.65
13:14:26 12:29:01 1 0.60 0.00 0.18 0.00 0.07 99.15
13:14:26 12:29:01 2 0.71 0.00 0.10 0.03 0.03 99.12
13:14:26 12:29:01 3 0.75 0.00 0.38 0.00 0.07 98.80
13:14:26 12:30:01 all 0.89 0.00 0.18 0.01 0.06 98.86
13:14:26 12:30:01 0 0.48 0.00 0.18 0.00 0.05 99.28
13:14:26 12:30:01 1 1.76 0.00 0.16 0.00 0.07 98.01
13:14:26 12:30:01 2 0.95 0.00 0.27 0.03 0.08 98.67
13:14:26 12:30:01 3 0.37 0.00 0.12 0.00 0.03 99.48
13:14:26 12:31:01 all 0.59 0.00 0.23 0.01 0.09 99.08
13:14:26 12:31:01 0 0.60 0.00 0.42 0.00 0.12 98.87
13:14:26 12:31:01 1 0.28 0.00 0.17 0.00 0.10 99.45
13:14:26 12:31:01 2 0.67 0.00 0.17 0.03 0.07 99.07
13:14:26 12:31:01 3 0.81 0.00 0.17 0.00 0.08 98.94
13:14:26 12:32:01 all 0.45 0.00 0.27 0.01 0.07 99.20
13:14:26 12:32:01 0 0.55 0.00 0.32 0.00 0.08 99.05
13:14:26 12:32:01 1 0.45 0.00 0.28 0.00 0.08 99.18
13:14:26 12:32:01 2 0.45 0.00 0.22 0.03 0.07 99.23
13:14:26 12:32:01 3 0.37 0.00 0.25 0.00 0.05 99.33
13:14:26
13:14:26 12:32:01 CPU %user %nice %system %iowait %steal %idle
13:14:26 12:33:01 all 0.37 0.00 0.19 0.01 0.08 99.35
13:14:26 12:33:01 0 0.33 0.00 0.20 0.00 0.08 99.38
13:14:26 12:33:01 1 0.43 0.00 0.18 0.00 0.10 99.28
13:14:26 12:33:01 2 0.18 0.00 0.02 0.05 0.03 99.72
13:14:26 12:33:01 3 0.52 0.00 0.37 0.00 0.10 99.02
13:14:26 12:34:01 all 0.40 0.00 0.21 0.01 0.05 99.33
13:14:26 12:34:01 0 0.22 0.00 0.10 0.00 0.05 99.63
13:14:26 12:34:01 1 0.80 0.00 0.45 0.00 0.08 98.67
13:14:26 12:34:01 2 0.32 0.00 0.13 0.02 0.03 99.50
13:14:26 12:34:01 3 0.27 0.00 0.15 0.02 0.05 99.52
13:14:26 12:35:01 all 0.71 0.00 0.19 0.01 0.07 99.03
13:14:26 12:35:01 0 1.79 0.00 0.10 0.00 0.05 98.06
13:14:26 12:35:01 1 0.62 0.00 0.40 0.00 0.10 98.88
13:14:26 12:35:01 2 0.30 0.00 0.13 0.03 0.05 99.48
13:14:26 12:35:01 3 0.12 0.00 0.12 0.00 0.07 99.70
13:14:26 12:36:01 all 0.44 0.00 0.20 0.01 0.07 99.29
13:14:26 12:36:01 0 0.57 0.00 0.23 0.00 0.08 99.11
13:14:26 12:36:01 1 0.58 0.00 0.32 0.00 0.07 99.03
13:14:26 12:36:01 2 0.25 0.00 0.12 0.03 0.07 99.53
13:14:26 12:36:01 3 0.35 0.00 0.12 0.02 0.05 99.47
13:14:26 12:37:01 all 0.45 0.00 0.25 0.01 0.06 99.22
13:14:26 12:37:01 0 0.40 0.00 0.30 0.00 0.05 99.25
13:14:26 12:37:01 1 0.77 0.00 0.42 0.00 0.08 98.73
13:14:26 12:37:01 2 0.37 0.00 0.22 0.02 0.08 99.31
13:14:26 12:37:01 3 0.28 0.00 0.08 0.02 0.02 99.60
13:14:26 12:38:01 all 0.33 0.00 0.19 0.01 0.07 99.40
13:14:26 12:38:01 0 0.40 0.00 0.28 0.00 0.08 99.23
13:14:26 12:38:01 1 0.22 0.00 0.13 0.00 0.05 99.60
13:14:26 12:38:01 2 0.34 0.00 0.10 0.03 0.07 99.46
13:14:26 12:38:01 3 0.38 0.00 0.23 0.02 0.07 99.30
13:14:26 12:39:01 all 0.50 0.00 0.28 0.01 0.08 99.13
13:14:26 12:39:01 0 0.33 0.00 0.23 0.00 0.07 99.37
13:14:26 12:39:01 1 0.45 0.00 0.17 0.00 0.08 99.30
13:14:26 12:39:01 2 0.65 0.00 0.40 0.03 0.10 98.81
13:14:26 12:39:01 3 0.57 0.00 0.33 0.00 0.05 99.05
13:14:26 12:40:01 all 0.30 0.00 0.17 0.00 0.08 99.45
13:14:26 12:40:01 0 0.22 0.00 0.08 0.00 0.07 99.63
13:14:26 12:40:01 1 0.27 0.00 0.18 0.00 0.07 99.48
13:14:26 12:40:01 2 0.25 0.00 0.07 0.02 0.05 99.61
13:14:26 12:40:01 3 0.45 0.00 0.35 0.00 0.12 99.08
13:14:26 12:41:01 all 0.36 0.00 0.20 0.01 0.06 99.37
13:14:26 12:41:01 0 0.23 0.00 0.13 0.00 0.05 99.58
13:14:26 12:41:01 1 0.37 0.00 0.13 0.00 0.05 99.45
13:14:26 12:41:01 2 0.47 0.00 0.34 0.02 0.10 99.08
13:14:26 12:41:01 3 0.37 0.00 0.18 0.02 0.05 99.38
13:14:26 12:42:01 all 0.37 0.00 0.20 0.01 0.07 99.35
13:14:26 12:42:01 0 0.38 0.00 0.22 0.00 0.10 99.30
13:14:26 12:42:01 1 0.52 0.00 0.33 0.00 0.10 99.05
13:14:26 12:42:01 2 0.28 0.00 0.17 0.03 0.05 99.46
13:14:26 12:42:01 3 0.28 0.00 0.10 0.00 0.03 99.58
13:14:26 12:43:01 all 0.26 0.00 0.17 0.01 0.08 99.48
13:14:26 12:43:01 0 0.23 0.00 0.10 0.00 0.07 99.60
13:14:26 12:43:01 1 0.43 0.00 0.37 0.00 0.10 99.10
13:14:26 12:43:01 2 0.15 0.00 0.10 0.03 0.07 99.65
13:14:26 12:43:01 3 0.23 0.00 0.10 0.00 0.08 99.58
13:14:26
13:14:26 12:43:01 CPU %user %nice %system %iowait %steal %idle
13:14:26 12:44:01 all 0.38 0.00 0.21 0.01 0.07 99.33
13:14:26 12:44:01 0 0.33 0.00 0.22 0.02 0.07 99.37
13:14:26 12:44:01 1 0.68 0.00 0.42 0.00 0.10 98.80
13:14:26 12:44:01 2 0.23 0.00 0.10 0.03 0.05 99.58
13:14:26 12:44:01 3 0.25 0.00 0.12 0.00 0.05 99.58
13:14:26 12:45:01 all 0.33 0.00 0.16 0.01 0.08 99.42
13:14:26 12:45:01 0 0.33 0.00 0.22 0.00 0.08 99.37
13:14:26 12:45:01 1 0.35 0.00 0.13 0.00 0.10 99.41
13:14:26 12:45:01 2 0.22 0.00 0.05 0.03 0.05 99.65
13:14:26 12:45:01 3 0.40 0.00 0.25 0.00 0.10 99.25
13:14:26 12:46:01 all 0.45 0.00 0.22 0.01 0.08 99.25
13:14:26 12:46:01 0 0.80 0.00 0.42 0.00 0.10 98.69
13:14:26 12:46:01 1 0.33 0.00 0.17 0.00 0.08 99.41
13:14:26 12:46:01 2 0.37 0.00 0.18 0.03 0.07 99.35
13:14:26 12:46:01 3 0.28 0.00 0.12 0.00 0.05 99.55
13:14:26 12:47:01 all 4.95 0.00 0.51 0.10 0.07 94.37
13:14:26 12:47:01 0 3.57 0.00 0.52 0.00 0.08 95.83
13:14:26 12:47:01 1 3.18 0.00 0.27 0.00 0.05 96.50
13:14:26 12:47:01 2 2.85 0.00 0.37 0.35 0.05 96.38
13:14:26 12:47:01 3 10.21 0.00 0.88 0.03 0.10 88.78
13:14:26 12:48:01 all 40.76 0.00 1.36 0.52 0.10 57.26
13:14:26 12:48:01 0 42.17 0.00 1.79 0.07 0.10 55.87
13:14:26 12:48:01 1 42.03 0.00 1.04 0.00 0.08 56.85
13:14:26 12:48:01 2 36.63 0.00 1.06 1.83 0.10 60.38
13:14:26 12:48:01 3 42.20 0.00 1.56 0.20 0.10 55.94
13:14:26 12:49:01 all 0.96 0.00 0.24 0.04 0.10 98.66
13:14:26 12:49:01 0 0.57 0.00 0.35 0.00 0.10 98.98
13:14:26 12:49:01 1 0.62 0.00 0.18 0.00 0.08 99.11
13:14:26 12:49:01 2 2.06 0.00 0.28 0.07 0.10 97.50
13:14:26 12:49:01 3 0.60 0.00 0.15 0.08 0.10 99.06
13:14:26 12:50:01 all 0.75 0.00 0.25 0.01 0.08 98.91
13:14:26 12:50:01 0 0.58 0.00 0.18 0.00 0.08 99.15
13:14:26 12:50:01 1 0.32 0.00 0.20 0.00 0.07 99.42
13:14:26 12:50:01 2 1.09 0.00 0.44 0.00 0.08 98.39
13:14:26 12:50:01 3 1.02 0.00 0.17 0.03 0.08 98.70
13:14:26 12:51:01 all 2.51 0.00 0.27 0.03 0.09 97.10
13:14:26 12:51:01 0 2.72 0.00 0.32 0.00 0.08 96.88
13:14:26 12:51:01 1 2.33 0.00 0.20 0.00 0.08 97.39
13:14:26 12:51:01 2 2.73 0.00 0.27 0.03 0.10 96.87
13:14:26 12:51:01 3 2.27 0.00 0.28 0.08 0.08 97.28
13:14:26 12:52:01 all 0.64 0.00 0.25 0.01 0.07 99.03
13:14:26 12:52:01 0 0.50 0.00 0.22 0.00 0.07 99.22
13:14:26 12:52:01 1 0.84 0.00 0.29 0.00 0.08 98.79
13:14:26 12:52:01 2 0.79 0.00 0.28 0.02 0.07 98.85
13:14:26 12:52:01 3 0.42 0.00 0.22 0.02 0.07 99.28
13:14:26 12:53:01 all 0.48 0.00 0.27 0.01 0.08 99.16
13:14:26 12:53:01 0 0.57 0.00 0.30 0.00 0.07 99.07
13:14:26 12:53:01 1 0.59 0.00 0.37 0.00 0.10 98.94
13:14:26 12:53:01 2 0.40 0.00 0.20 0.02 0.08 99.30
13:14:26 12:53:01 3 0.35 0.00 0.20 0.03 0.08 99.33
13:14:26 12:54:01 all 2.73 0.00 0.24 0.01 0.09 96.92
13:14:26 12:54:01 0 3.20 0.00 0.35 0.00 0.10 96.35
13:14:26 12:54:01 1 2.58 0.00 0.30 0.00 0.08 97.03
13:14:26 12:54:01 2 2.43 0.00 0.25 0.02 0.10 97.21
13:14:26 12:54:01 3 2.71 0.00 0.07 0.03 0.08 97.11
13:14:26
13:14:26 12:54:01 CPU %user %nice %system %iowait %steal %idle
13:14:26 12:55:01 all 0.97 0.00 0.27 0.01 0.09 98.67
13:14:26 12:55:01 0 1.72 0.00 0.25 0.00 0.08 97.95
13:14:26 12:55:01 1 1.42 0.00 0.37 0.00 0.10 98.12
13:14:26 12:55:01 2 0.40 0.00 0.23 0.02 0.10 99.25
13:14:26 12:55:01 3 0.32 0.00 0.22 0.02 0.08 99.36
13:14:26 12:56:01 all 0.75 0.00 0.30 0.01 0.07 98.87
13:14:26 12:56:01 0 0.88 0.00 0.22 0.00 0.07 98.83
13:14:26 12:56:01 1 1.00 0.00 0.28 0.00 0.07 98.65
13:14:26 12:56:01 2 0.50 0.00 0.33 0.02 0.08 99.06
13:14:26 12:56:01 3 0.60 0.00 0.38 0.02 0.07 98.93
13:14:26 12:57:01 all 1.91 0.00 0.25 0.01 0.09 97.74
13:14:26 12:57:01 0 1.90 0.00 0.15 0.02 0.07 97.86
13:14:26 12:57:01 1 2.20 0.00 0.30 0.00 0.10 97.40
13:14:26 12:57:01 2 1.80 0.00 0.18 0.00 0.08 97.93
13:14:26 12:57:01 3 1.72 0.00 0.37 0.02 0.12 97.77
13:14:26 12:58:01 all 0.63 0.00 0.20 0.01 0.08 99.07
13:14:26 12:58:01 0 0.22 0.00 0.07 0.02 0.05 99.65
13:14:26 12:58:01 1 1.37 0.00 0.17 0.00 0.10 98.37
13:14:26 12:58:01 2 0.37 0.00 0.23 0.00 0.10 99.30
13:14:26 12:58:01 3 0.59 0.00 0.33 0.02 0.08 98.98
13:14:26 12:59:01 all 1.21 0.00 0.23 0.02 0.07 98.46
13:14:26 12:59:01 0 0.47 0.00 0.17 0.02 0.10 99.25
13:14:26 12:59:01 1 2.69 0.00 0.03 0.02 0.05 97.21
13:14:26 12:59:01 2 1.17 0.00 0.23 0.00 0.07 98.53
13:14:26 12:59:01 3 0.50 0.00 0.50 0.03 0.08 98.88
13:14:26 13:00:01 all 3.81 0.00 0.41 0.01 0.09 95.69
13:14:26 13:00:01 0 4.13 0.00 0.30 0.02 0.07 95.49
13:14:26 13:00:01 1 4.09 0.00 0.48 0.00 0.12 95.30
13:14:26 13:00:01 2 3.19 0.00 0.27 0.00 0.07 96.48
13:14:26 13:00:01 3 3.81 0.00 0.59 0.03 0.10 95.47
13:14:26 13:01:01 all 5.17 0.00 0.39 0.02 0.08 94.34
13:14:26 13:01:01 0 4.75 0.00 0.52 0.00 0.07 94.66
13:14:26 13:01:01 1 5.64 0.00 0.18 0.00 0.08 94.09
13:14:26 13:01:01 2 5.11 0.00 0.47 0.03 0.10 94.28
13:14:26 13:01:01 3 5.17 0.00 0.40 0.03 0.07 94.32
13:14:26 13:02:01 all 0.92 0.00 0.24 0.01 0.08 98.74
13:14:26 13:02:01 0 0.50 0.00 0.28 0.00 0.08 99.13
13:14:26 13:02:01 1 2.39 0.00 0.31 0.02 0.08 97.20
13:14:26 13:02:01 2 0.27 0.00 0.15 0.02 0.08 99.48
13:14:26 13:02:01 3 0.52 0.00 0.20 0.02 0.08 99.18
13:14:26 13:03:01 all 0.93 0.00 0.22 0.01 0.07 98.76
13:14:26 13:03:01 0 0.53 0.00 0.35 0.00 0.07 99.05
13:14:26 13:03:01 1 2.51 0.00 0.33 0.00 0.07 97.10
13:14:26 13:03:01 2 0.28 0.00 0.10 0.02 0.07 99.53
13:14:26 13:03:01 3 0.38 0.00 0.12 0.03 0.07 99.40
13:14:26 13:04:01 all 0.92 0.00 0.25 0.01 0.07 98.74
13:14:26 13:04:01 0 0.42 0.00 0.27 0.00 0.08 99.23
13:14:26 13:04:01 1 2.18 0.00 0.20 0.00 0.07 97.56
13:14:26 13:04:01 2 0.57 0.00 0.13 0.02 0.07 99.21
13:14:26 13:04:01 3 0.50 0.00 0.40 0.03 0.08 98.98
13:14:26 13:05:01 all 0.93 0.00 0.29 0.03 0.09 98.66
13:14:26 13:05:01 0 0.45 0.00 0.18 0.00 0.08 99.28
13:14:26 13:05:01 1 2.00 0.00 0.48 0.00 0.10 97.42
13:14:26 13:05:01 2 0.43 0.00 0.15 0.05 0.08 99.28
13:14:26 13:05:01 3 0.84 0.00 0.33 0.07 0.10 98.66
13:14:26
13:14:26 13:05:01 CPU %user %nice %system %iowait %steal %idle
13:14:26 13:06:01 all 0.49 0.00 0.25 0.01 0.08 99.17
13:14:26 13:06:01 0 0.35 0.00 0.17 0.00 0.07 99.42
13:14:26 13:06:01 1 0.74 0.00 0.42 0.00 0.08 98.76
13:14:26 13:06:01 2 0.40 0.00 0.18 0.02 0.08 99.32
13:14:26 13:06:01 3 0.47 0.00 0.22 0.03 0.08 99.20
13:14:26 13:07:01 all 0.53 0.00 0.30 0.01 0.10 99.06
13:14:26 13:07:01 0 0.65 0.00 0.23 0.00 0.10 99.02
13:14:26 13:07:01 1 0.39 0.00 0.34 0.00 0.10 99.18
13:14:26 13:07:01 2 0.47 0.00 0.28 0.02 0.10 99.13
13:14:26 13:07:01 3 0.62 0.00 0.33 0.03 0.10 98.91
13:14:26 13:08:01 all 0.39 0.00 0.26 0.01 0.09 99.26
13:14:26 13:08:01 0 0.53 0.00 0.30 0.00 0.08 99.08
13:14:26 13:08:01 1 0.29 0.00 0.18 0.00 0.10 99.43
13:14:26 13:08:01 2 0.44 0.00 0.37 0.02 0.08 99.10
13:14:26 13:08:01 3 0.30 0.00 0.18 0.02 0.08 99.42
13:14:26 13:09:01 all 0.36 0.00 0.23 0.02 0.08 99.31
13:14:26 13:09:01 0 0.42 0.00 0.20 0.00 0.08 99.30
13:14:26 13:09:01 1 0.25 0.00 0.17 0.00 0.07 99.52
13:14:26 13:09:01 2 0.42 0.00 0.28 0.03 0.08 99.18
13:14:26 13:09:01 3 0.37 0.00 0.27 0.03 0.08 99.25
13:14:26 13:10:01 all 0.39 0.00 0.24 0.01 0.08 99.28
13:14:26 13:10:01 0 0.39 0.00 0.15 0.00 0.08 99.38
13:14:26 13:10:01 1 0.27 0.00 0.17 0.00 0.07 99.50
13:14:26 13:10:01 2 0.35 0.00 0.20 0.02 0.07 99.37
13:14:26 13:10:01 3 0.55 0.00 0.45 0.02 0.08 98.89
13:14:26 13:11:01 all 0.38 0.00 0.18 0.01 0.10 99.33
13:14:26 13:11:01 0 0.49 0.00 0.27 0.00 0.12 99.13
13:14:26 13:11:01 1 0.35 0.00 0.10 0.00 0.08 99.47
13:14:26 13:11:01 2 0.35 0.00 0.17 0.02 0.08 99.38
13:14:26 13:11:01 3 0.34 0.00 0.18 0.03 0.10 99.34
13:14:26 13:12:01 all 0.38 0.00 0.31 0.00 0.10 99.21
13:14:26 13:12:01 0 0.37 0.00 0.27 0.00 0.10 99.26
13:14:26 13:12:01 1 0.62 0.00 0.50 0.00 0.10 98.78
13:14:26 13:12:01 2 0.23 0.00 0.18 0.00 0.08 99.50
13:14:26 13:12:01 3 0.30 0.00 0.27 0.02 0.10 99.31
13:14:26 13:13:01 all 0.36 0.00 0.20 0.01 0.08 99.34
13:14:26 13:13:01 0 0.17 0.00 0.12 0.00 0.05 99.67
13:14:26 13:13:01 1 0.72 0.00 0.33 0.00 0.08 98.86
13:14:26 13:13:01 2 0.25 0.00 0.15 0.02 0.08 99.50
13:14:26 13:13:01 3 0.30 0.00 0.20 0.03 0.12 99.34
13:14:26 13:14:01 all 22.74 0.00 1.44 0.77 0.08 74.97
13:14:26 13:14:01 0 19.36 0.00 1.39 0.20 0.07 78.99
13:14:26 13:14:01 1 17.82 0.00 1.50 0.67 0.08 79.93
13:14:26 13:14:01 2 29.42 0.00 1.44 0.89 0.08 68.17
13:14:26 13:14:01 3 24.37 0.00 1.42 1.34 0.08 72.79
13:14:26 Average: all 13.34 0.15 0.79 0.61 0.09 85.02
13:14:26 Average: 0 13.12 0.13 0.78 0.49 0.09 85.39
13:14:26 Average: 1 13.43 0.14 0.79 0.69 0.09 84.86
13:14:26 Average: 2 13.40 0.15 0.80 0.70 0.08 84.87
13:14:26 Average: 3 13.43 0.17 0.79 0.55 0.09 84.97
13:14:26
13:14:26
13:14:26
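The sar CPU report above is plain columnar text (per-sample rows for CPUs all/0-3 plus the final Average rows). A minimal sketch of summarizing it offline, assuming the block is saved verbatim to a local file with the hypothetical name cpu.log; the script and its names are illustrative, not part of the job:

```python
import re
from collections import defaultdict

# Hypothetical file: the sar block above, saved verbatim from the console log.
LOG_FILE = "cpu.log"

# A data row looks like:
# "13:14:26 12:23:01 all 74.07 0.00 2.64 0.21 0.09 22.98"
ROW = re.compile(
    r"^\d{2}:\d{2}:\d{2}\s+"            # Jenkins console timestamp prefix
    r"(\d{2}:\d{2}:\d{2}|Average:)\s+"  # sar sample time, or the Average row
    r"(all|\d+)\s+"                     # CPU column
    r"([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s*$"
)   # %user %nice %system %iowait %steal %idle

peak_user = defaultdict(float)  # busiest single sample per CPU (%user)

with open(LOG_FILE) as fh:
    for line in fh:
        m = ROW.match(line)
        if not m:
            continue  # repeated headers and blank timestamp-only lines
        when, cpu = m.group(1), m.group(2)
        user, idle = float(m.group(3)), float(m.group(8))
        if when == "Average:":
            print(f"CPU {cpu}: average {user:.2f}% user, {idle:.2f}% idle")
        else:
            peak_user[cpu] = max(peak_user[cpu], user)

for cpu, user in sorted(peak_user.items()):
    print(f"CPU {cpu}: peak {user:.2f}% user in a single sample")
```

For this build the busiest interval is 12:23:01 (about 74% user across all cores, matching the tox/test phase), while the overall averages stay around 13% user and 85% idle.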