Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/113906 Running as SYSTEM [EnvInject] - Loading node environment variables. Building remotely on prd-ubuntu2004-docker-4c-16g-43136 (ubuntu2004-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master [ssh-agent] Looking for ssh-agent implementation... [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) $ ssh-agent SSH_AUTH_SOCK=/tmp/ssh-002chT0LMEVC/agent.12643 SSH_AGENT_PID=12646 [ssh-agent] Started. Running ssh-add (command line suppressed) Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_15255351946247273321.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_15255351946247273321.key) [ssh-agent] Using credentials jenkins (jenkins-ssh) The recommended git tool is: NONE using credential jenkins-ssh Wiping out workspace first. Cloning the remote Git repository Cloning repository git://devvexx.opendaylight.org/mirror/transportpce > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce > git --version # timeout=10 > git --version # 'git version 2.25.1' using GIT_SSH to set credentials jenkins-ssh Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce using GIT_SSH to set credentials jenkins-ssh Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/06/113906/7 # timeout=10 > git rev-parse 3a46cbc94f7426a52fca6c0e33606f3afd566d13^{commit} # timeout=10 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script Checking out Revision 3a46cbc94f7426a52fca6c0e33606f3afd566d13 (refs/changes/06/113906/7) > git config core.sparsecheckout # timeout=10 > git checkout -f 3a46cbc94f7426a52fca6c0e33606f3afd566d13 # timeout=10 Commit message: "Add Tapi Abstracted Node to OR Topo" > git rev-parse FETCH_HEAD^{commit} # timeout=10 > git rev-list --no-walk 9889c236444fdb4bb40abe9dc03a01d4dc69b802 # timeout=10 > git remote # timeout=10 > git submodule init # timeout=10 > git submodule sync # timeout=10 > git config --get remote.origin.url # timeout=10 > git submodule init # timeout=10 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10 ERROR: No submodules found. provisioning config files... 
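For reference, the checkout above can be replayed outside Jenkins. A minimal Python sketch of the same sequence (init the workspace, fetch the Gerrit change ref refs/changes/06/113906/7 from the mirror, check out the patchset revision); the repository URL, refspec and SHA are copied from the log, while the script itself is only an illustration and not part of the job:

    import subprocess

    REPO = "git://devvexx.opendaylight.org/mirror/transportpce"
    WORKSPACE = "/w/workspace/transportpce-tox-verify-transportpce-master"
    CHANGE_REF = "refs/changes/06/113906/7"   # Gerrit patchset ref from the log
    REVISION = "3a46cbc94f7426a52fca6c0e33606f3afd566d13"

    def git(*args):
        # Run one git command inside the workspace, failing fast like the plugin's timeout-guarded calls.
        print("> git", " ".join(args))
        subprocess.run(["git", "-C", WORKSPACE, *args], check=True)

    subprocess.run(["git", "init", WORKSPACE], check=True)
    git("fetch", "--tags", "--force", "--progress", "--", REPO,
        "+refs/heads/*:refs/remotes/origin/*")
    git("fetch", "--force", "--progress", "--", REPO, CHANGE_REF)
    git("checkout", "-f", REVISION)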
copy managed file [npmrc] to file:/home/jenkins/.npmrc copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins2296695745776641185.sh ---> python-tools-install.sh Setup pyenv: * system (set by /opt/pyenv/version) * 3.8.13 (set by /opt/pyenv/version) * 3.9.13 (set by /opt/pyenv/version) * 3.10.13 (set by /opt/pyenv/version) * 3.11.7 (set by /opt/pyenv/version) lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-IHAD lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-IHAD/bin to PATH Generating Requirements File Python 3.11.7 pip 24.2 from /tmp/venv-IHAD/lib/python3.11/site-packages/pip (python 3.11) appdirs==1.4.4 argcomplete==3.5.1 aspy.yaml==1.3.0 attrs==24.2.0 autopage==0.5.2 beautifulsoup4==4.12.3 boto3==1.35.41 botocore==1.35.41 bs4==0.0.2 cachetools==5.5.0 certifi==2024.8.30 cffi==1.17.1 cfgv==3.4.0 chardet==5.2.0 charset-normalizer==3.4.0 click==8.1.7 cliff==4.7.0 cmd2==2.4.3 cryptography==3.3.2 debtcollector==3.0.0 decorator==5.1.1 defusedxml==0.7.1 Deprecated==1.2.14 distlib==0.3.9 dnspython==2.7.0 docker==4.2.2 dogpile.cache==1.3.3 durationpy==0.9 email_validator==2.2.0 filelock==3.16.1 future==1.0.0 gitdb==4.0.11 GitPython==3.1.43 google-auth==2.35.0 httplib2==0.22.0 identify==2.6.1 idna==3.10 importlib-resources==1.5.0 iso8601==2.1.0 Jinja2==3.1.4 jmespath==1.0.1 jsonpatch==1.33 jsonpointer==3.0.0 jsonschema==4.23.0 jsonschema-specifications==2024.10.1 keystoneauth1==5.8.0 kubernetes==31.0.0 lftools==0.37.10 lxml==5.3.0 MarkupSafe==3.0.1 msgpack==1.1.0 multi_key_dict==2.0.3 munch==4.0.0 netaddr==1.3.0 netifaces==0.11.0 niet==1.4.2 nodeenv==1.9.1 oauth2client==4.1.3 oauthlib==3.2.2 openstacksdk==4.0.0 os-client-config==2.1.0 os-service-types==1.7.0 osc-lib==3.1.0 oslo.config==9.6.0 oslo.context==5.6.0 oslo.i18n==6.4.0 oslo.log==6.1.2 oslo.serialization==5.5.0 oslo.utils==7.3.0 packaging==24.1 pbr==6.1.0 platformdirs==4.3.6 prettytable==3.11.0 pyasn1==0.6.1 pyasn1_modules==0.4.1 pycparser==2.22 pygerrit2==2.0.15 PyGithub==2.4.0 PyJWT==2.9.0 PyNaCl==1.5.0 pyparsing==2.4.7 pyperclip==1.9.0 pyrsistent==0.20.0 python-cinderclient==9.6.0 python-dateutil==2.9.0.post0 python-heatclient==4.0.0 python-jenkins==1.8.2 python-keystoneclient==5.5.0 python-magnumclient==4.7.0 python-openstackclient==7.1.3 python-swiftclient==4.6.0 PyYAML==6.0.2 referencing==0.35.1 requests==2.32.3 requests-oauthlib==2.0.0 requestsexceptions==1.4.0 rfc3986==2.0.0 rpds-py==0.20.0 rsa==4.9 ruamel.yaml==0.18.6 ruamel.yaml.clib==0.2.8 s3transfer==0.10.3 simplejson==3.19.3 six==1.16.0 smmap==5.0.1 soupsieve==2.6 stevedore==5.3.0 tabulate==0.9.0 toml==0.10.2 tomlkit==0.13.2 tqdm==4.66.5 typing_extensions==4.12.2 tzdata==2024.2 urllib3==1.26.20 virtualenv==20.26.6 wcwidth==0.2.13 websocket-client==1.8.0 wrapt==1.16.0 xdg==6.0.0 xmltodict==0.14.2 yq==3.4.3 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PYTHON=python3 [EnvInject] - Variables injected successfully. 
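The python-tools-install.sh step above amounts to: create a temporary venv, record its path in /tmp/.os_lf_venv, install lftools into it, and freeze the installed packages ("Generating Requirements File"). A rough Python sketch of that bootstrap using only the standard library; the venv path and output file name are illustrative, and the real logic lives in lf-activate-venv() in /home/jenkins/lf-env.sh:

    import subprocess
    import tempfile
    import venv
    from pathlib import Path

    venv_dir = Path(tempfile.mkdtemp(prefix="venv-"))        # e.g. /tmp/venv-IHAD in the log
    venv.EnvBuilder(with_pip=True).create(venv_dir)           # same effect as: python3 -m venv <dir>
    Path("/tmp/.os_lf_venv").write_text(str(venv_dir))        # remember the venv for later build steps
    py = str(venv_dir / "bin" / "python3")
    subprocess.run([py, "-m", "pip", "install", "--upgrade", "--quiet", "pip", "virtualenv"], check=True)
    subprocess.run([py, "-m", "pip", "install", "--upgrade", "--quiet", "lftools"], check=True)
    # "Generating Requirements File": freeze whatever ended up in the venv
    frozen = subprocess.run([py, "-m", "pip", "freeze"], capture_output=True, text=True, check=True)
    Path("requirements-generated.txt").write_text(frozen.stdout)   # output file name is an assumption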
[transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins8006703684608994767.sh ---> tox-install.sh + source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-CTaP + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ grep -E '^[0-9.]*[0-9]$' ++ awk '{ print $1 }' ++ sed 's/^[ *]* //' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] + [[ ! -f /tmp/.toxenv ]] + [[ -n '' ]] + python3 -m venv /tmp/venv-CTaP + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-CTaP' lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-CTaP + echo /tmp/venv-CTaP + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv + /tmp/venv-CTaP/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-CTaP/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-CTaP/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-CTaP/bin to PATH + PATH=/tmp/venv-CTaP/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + python3 --version Python 3.11.7 + python3 -m pip --version pip 24.2 from /tmp/venv-CTaP/lib/python3.11/site-packages/pip (python 3.11) + python3 -m pip freeze cachetools==5.5.0 chardet==5.2.0 colorama==0.4.6 distlib==0.3.9 filelock==3.16.1 packaging==24.1 platformdirs==4.3.6 pluggy==1.5.0 pyproject-api==1.8.0 tox==4.22.0 urllib3==1.26.20 virtualenv==20.26.6 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins7870444051426149305.sh [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PARALLEL=True [EnvInject] - Variables injected successfully. [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins11392016447314697376.sh ---> tox-run.sh + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox + cd /w/workspace/transportpce-tox-verify-transportpce-master/. 
+ source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-TRDw + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ awk '{ print $1 }' ++ grep -E '^[0-9.]*[0-9]$' ++ sed 's/^[ *]* //' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.7 ++ [[ -z 3.11.7 ]] ++ echo 3.11.7 ++ return 0 + pyenv local 3.11.7 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.7 + pyenv local 3.11.7 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] ++ cat /tmp/.toxenv + lf_venv=/tmp/venv-CTaP + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-CTaP from' file:/tmp/.toxenv lf-activate-venv(): INFO: Reuse venv:/tmp/venv-CTaP from file:/tmp/.toxenv + /tmp/venv-CTaP/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing: tox virtualenv urllib3~=1.26.15 + /tmp/venv-CTaP/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-CTaP/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-CTaP/bin to PATH + PATH=/tmp/venv-CTaP/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + [[ -d /opt/pyenv ]] + echo '---> Setting up pyenv' ---> Setting up pyenv + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/tmp/venv-CTaP/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/tmp/venv-CTaP/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ pwd + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master + export PYTHONPATH + export TOX_TESTENV_PASSENV=PYTHONPATH + TOX_TESTENV_PASSENV=PYTHONPATH + tox --version 4.22.0 from /tmp/venv-CTaP/lib/python3.11/site-packages/tox/__init__.py + PARALLEL=True + TOX_OPTIONS_LIST= + [[ -n '' ]] + case ${PARALLEL,,} in + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' + tox --parallel auto --parallel-live + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log docs: install_deps> python -I -m pip install -r docs/requirements.txt buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt checkbashisms: freeze> python -m pip freeze --all docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt checkbashisms: pip==24.2,setuptools==75.1.0,wheel==0.44.0 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y 
https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + script ./reflectwarn.sh does not appear to have a #! interpreter line; you may get strange results checkbashisms: OK ✔ in 3 seconds pre-commit: install_deps> python -I -m pip install pre-commit pre-commit: freeze> python -m pip freeze --all pre-commit: cfgv==3.4.0,distlib==0.3.9,filelock==3.16.1,identify==2.6.1,nodeenv==1.9.1,pip==24.2,platformdirs==4.3.6,pre_commit==4.0.1,PyYAML==6.0.2,setuptools==75.1.0,virtualenv==20.26.6,wheel==0.44.0 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' /usr/bin/cpan pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. buildcontroller: freeze> python -m pip freeze --all [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh + update-java-alternatives -l java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. 
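The checkbashisms env that just passed ("checkbashisms: OK") simply runs checkbashisms -f over every *.sh file outside hidden directories, which is why ./reflectwarn.sh (no #! line) only produces a warning. A rough Python equivalent of that find/-exec loop, assuming checkbashisms is installed and on PATH:

    import subprocess
    from pathlib import Path

    # Same selection as: find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' +
    scripts = [p for p in Path(".").rglob("*.sh")
               if not any(part.startswith(".") for part in p.parts)]
    if scripts:
        # One batched call, like find's "-exec ... +" form; a non-zero exit means bashisms were found
        subprocess.run(["checkbashisms", "-f", *map(str, scripts)], check=False)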
[INFO] Initializing environment for https://github.com/perltidy/perltidy. [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... + + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; java -version + JAVA_VER=21 + echo 21 21 + javac -version + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; + JAVAC_VER=21 + echo 21 21 ok, java is 21 or newer + [ 21 -ge 21 ] + [ 21 -ge 21 ] + echo ok, java is 21 or newer + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz -P /tmp 2024-10-16 14:14:53 URL:https://dlcdn.apache.org/maven/maven-3/3.9.8/binaries/apache-maven-3.9.8-bin.tar.gz [9083702/9083702] -> "/tmp/apache-maven-3.9.8-bin.tar.gz" [1] + sudo mkdir -p /opt + sudo tar xf /tmp/apache-maven-3.9.8-bin.tar.gz -C /opt + sudo ln -s /opt/apache-maven-3.9.8 /opt/maven + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn + mvn --version Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256) Maven home: /opt/maven Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 Default locale: en, platform encoding: UTF-8 OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix" NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/perltidy/perltidy. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... 
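build_controller.sh above extracts the Java major version from the java -version / javac -version banners with sed, requires 21 or newer, and only then unpacks Maven 3.9.8 under /opt. A hedged Python sketch of that version gate; the regex only approximates the sed expressions traced in the log:

    import re
    import subprocess

    def major_version(cmd):
        # `java -version` prints its banner on stderr, `javac -version` on stdout; scan both.
        proc = subprocess.run(cmd, capture_output=True, text=True)
        match = re.search(r"(\d+)\.\d+\.\d+", proc.stdout + proc.stderr)
        return int(match.group(1)) if match else 0

    JAVA_VER = major_version(["java", "-version"])
    JAVAC_VER = major_version(["javac", "-version"])
    if JAVA_VER >= 21 and JAVAC_VER >= 21:
        print("ok, java is 21 or newer")          # mirrors the "[ 21 -ge 21 ]" checks in the trace
    else:
        raise SystemExit("Java 21+ is required before installing Maven and building the controller")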
docs: freeze> python -m pip freeze --all docs-linkcheck: freeze> python -m pip freeze --all docs: alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.1,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.2,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html docs-linkcheck: alabaster==1.0.0,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.4.0,contourpy==1.3.0,cycler==0.12.1,docutils==0.21.2,fonttools==4.54.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.10,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==3.0.1,matplotlib==3.9.2,numpy==2.1.2,nwdiag==3.0.0,packaging==24.1,pillow==11.0.0,pip==24.2,Pygments==2.18.0,pyparsing==3.2.0,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==75.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==8.1.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.1,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.3,webcolors==24.8.0,wheel==0.44.0 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck trim trailing whitespace.................................................Passed Tabs remover.............................................................Passed autopep8.................................................................docs: OK ✔ in 41.99 seconds pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' Failed - hook id: autopep8 - files were modified by this hook perltidy.................................................................Passed pre-commit hook(s) made changes. 
If you are seeing this message in CI, reproduce locally with: `pre-commit run --all-files`. To run `pre-commit` as part of git workflow, use `pre-commit install`. All changes made by hooks: diff --git a/tests/transportpce_tests/1.2.1/test02_topo_portmapping.py b/tests/transportpce_tests/1.2.1/test02_topo_portmapping.py index b0403c0b..9773dd08 100644 --- a/tests/transportpce_tests/1.2.1/test02_topo_portmapping.py +++ b/tests/transportpce_tests/1.2.1/test02_topo_portmapping.py @@ -56,7 +56,7 @@ class TransportPCEtesting(unittest.TestCase): for node in resTopo['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue response = test_utils.get_portmapping_node_attr(nodeMapId, "node-info", None) self.assertEqual(response['status_code'], requests.codes.ok) diff --git a/tests/transportpce_tests/1.2.1/test03_topology.py b/tests/transportpce_tests/1.2.1/test03_topology.py index 4cac2581..df42f53e 100644 --- a/tests/transportpce_tests/1.2.1/test03_topology.py +++ b/tests/transportpce_tests/1.2.1/test03_topology.py @@ -165,7 +165,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] self.assertIn({'network-ref': 'openroadm-network', 'node-ref': 'ROADMA01'}, node['supporting-node']) @@ -199,7 +199,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertEqual(node['supporting-node'][0]['network-ref'], 'clli-network') self.assertEqual(node['supporting-node'][0]['node-ref'], 'NodeA') @@ -221,7 +221,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] # Tests related to XPDRA nodes @@ -361,7 +361,7 @@ class TransportPCETopologyTesting(unittest.TestCase): self.assertEqual(node['supporting-node'][0]['network-ref'], 'clli-network') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn(nodeId, CHECK_LIST) self.assertEqual(node['supporting-node'][0]['node-ref'], @@ -438,7 +438,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] # Tests related to XPDRA nodes @@ -550,7 +550,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn(nodeId, listNode) self.assertEqual(node['org-openroadm-clli-network:clli'], nodeId) @@ -643,7 +643,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = 
node['org-openroadm-common-network:node-type'] # Tests related to XPDRA nodes @@ -716,7 +716,7 @@ class TransportPCETopologyTesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn({'network-ref': 'openroadm-network', 'node-ref': 'ROADMA01'}, node['supporting-node']) nodeType = node['org-openroadm-common-network:node-type'] diff --git a/tests/transportpce_tests/2.2.1/test02_topo_portmapping.py b/tests/transportpce_tests/2.2.1/test02_topo_portmapping.py index 68699b00..e0a1ffaf 100644 --- a/tests/transportpce_tests/2.2.1/test02_topo_portmapping.py +++ b/tests/transportpce_tests/2.2.1/test02_topo_portmapping.py @@ -59,7 +59,7 @@ class TransportPCEtesting(unittest.TestCase): # pylint: disable=consider-using-f-string print("nodeId={}".format(nodeId)) nodeMapId = nodeId.split("-")[0] + "-" + nodeId.split("-")[1] - if (nodeMapId == 'TAPI-SBI') : + if (nodeMapId == 'TAPI-SBI'): continue print("nodeMapId={}".format(nodeMapId)) response = test_utils.get_portmapping_node_attr(nodeMapId, "node-info", None) diff --git a/tests/transportpce_tests/2.2.1/test03_topology.py b/tests/transportpce_tests/2.2.1/test03_topology.py index 97c9a1fb..6bbf7c3e 100644 --- a/tests/transportpce_tests/2.2.1/test03_topology.py +++ b/tests/transportpce_tests/2.2.1/test03_topology.py @@ -168,7 +168,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] self.assertIn({'network-ref': 'openroadm-network', 'node-ref': 'ROADM-A1'}, node['supporting-node']) @@ -202,7 +202,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertEqual(node['supporting-node'][0]['network-ref'], 'clli-network') self.assertEqual(node['supporting-node'][0]['node-ref'], 'NodeA') @@ -224,7 +224,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] # Tests related to XPDRA nodes @@ -367,7 +367,7 @@ class TransportPCEtesting(unittest.TestCase): self.assertEqual(node['supporting-node'][0]['network-ref'], 'clli-network') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue if nodeId in CHECK_LIST: self.assertEqual(node['supporting-node'][0]['node-ref'], CHECK_LIST[nodeId]['node-ref']) @@ -446,7 +446,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] if nodeId == 'XPDR-A1-XPDR1': @@ -563,7 +563,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn(nodeId, listNode) self.assertEqual(node['org-openroadm-clli-network:clli'], nodeId) @@ 
-657,7 +657,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue nodeType = node['org-openroadm-common-network:node-type'] # Tests related to XPDRA nodes @@ -734,7 +734,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn({'network-ref': 'openroadm-network', 'node-ref': 'ROADM-A1'}, node['supporting-node']) nodeType = node['org-openroadm-common-network:node-type'] diff --git a/tests/transportpce_tests/2.2.1/test04_otn_topology.py b/tests/transportpce_tests/2.2.1/test04_otn_topology.py index f1d1ec77..3a34bf5f 100644 --- a/tests/transportpce_tests/2.2.1/test04_otn_topology.py +++ b/tests/transportpce_tests/2.2.1/test04_otn_topology.py @@ -86,7 +86,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue self.assertIn({'network-ref': 'openroadm-network', 'node-ref': 'SPDR-SA1'}, node['supporting-node']) self.assertIn({'network-ref': 'clli-network', 'node-ref': 'NodeSA'}, node['supporting-node']) @@ -150,7 +150,7 @@ class TransportPCEtesting(unittest.TestCase): for node in response['network'][0]['node']: nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue if nodeId in CHECK_LIST: self.assertEqual(node['org-openroadm-common-network:node-type'], CHECK_LIST[nodeId]['node-type']) diff --git a/tests/transportpce_tests/hybrid/test01_device_change_notifications.py b/tests/transportpce_tests/hybrid/test01_device_change_notifications.py index ed8593da..96865e50 100644 --- a/tests/transportpce_tests/hybrid/test01_device_change_notifications.py +++ b/tests/transportpce_tests/hybrid/test01_device_change_notifications.py @@ -235,7 +235,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = node['ietf-network-topology:termination-point'] for tp in tp_list: @@ -301,7 +301,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeMapId = node['node-id'].split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = node['ietf-network-topology:termination-point'] for tp in tp_list: @@ -356,7 +356,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = node['ietf-network-topology:termination-point'] for tp in tp_list: @@ -444,7 +444,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = 
node['ietf-network-topology:termination-point'] for tp in tp_list: @@ -530,7 +530,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = node['ietf-network-topology:termination-point'] for tp in tp_list: @@ -616,7 +616,7 @@ class TransportPCEFulltesting(unittest.TestCase): self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') nodeId = node['node-id'] nodeMapId = nodeId.split("-")[0] - if (nodeMapId == 'TAPI') : + if (nodeMapId == 'TAPI'): continue tp_list = node['ietf-network-topology:termination-point'] for tp in tp_list: docs-linkcheck: OK ✔ in 43.18 seconds pre-commit: exit 1 (39.46 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure pid=29116 pre-commit: FAIL ✖ in 43.1 seconds pylint: freeze> python -m pip freeze --all pylint: astroid==3.3.5,dill==0.3.9,isort==5.13.2,mccabe==0.7.0,pip==24.2,platformdirs==4.3.6,pylint==3.3.1,setuptools==75.1.0,tomlkit==0.13.2,wheel==0.44.0 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + ************* Module 1.2.1.test03_topology transportpce_tests/1.2.1/test03_topology.py:168:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:202:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:224:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:364:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:441:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:553:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:646:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:719:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/1.2.1/test03_topology.py:430:4: R0912: Too many branches (13/12) (too-many-branches) ************* Module 1.2.1.test02_topo_portmapping transportpce_tests/1.2.1/test02_topo_portmapping.py:59:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) ************* Module hybrid.test01_device_change_notifications transportpce_tests/hybrid/test01_device_change_notifications.py:238:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/hybrid/test01_device_change_notifications.py:304:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/hybrid/test01_device_change_notifications.py:359:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/hybrid/test01_device_change_notifications.py:447:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) 
transportpce_tests/hybrid/test01_device_change_notifications.py:533:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/hybrid/test01_device_change_notifications.py:619:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) ************* Module 2.2.1.test03_topology transportpce_tests/2.2.1/test03_topology.py:171:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:205:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:227:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:370:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:449:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:566:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:660:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test03_topology.py:737:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) ************* Module 2.2.1.test04_otn_topology transportpce_tests/2.2.1/test04_otn_topology.py:89:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test04_otn_topology.py:153:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) transportpce_tests/2.2.1/test04_otn_topology.py:120:4: R0912: Too many branches (13/12) (too-many-branches) ************* Module 2.2.1.test02_topo_portmapping transportpce_tests/2.2.1/test02_topo_portmapping.py:62:0: C0325: Unnecessary parens after 'if' keyword (superfluous-parens) ----------------------------------- Your code has been rated at 9.97/10 pylint: exit 1 (20.69 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + pid=30052 pylint: FAIL ✖ in 25.64 seconds buildcontroller: OK ✔ in 1 minute 53.52 seconds build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests221: freeze> python -m pip freeze --all build_karaf_tests121: freeze> python -m pip freeze --all sims: freeze> 
python -m pip freeze --all build_karaf_tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh Using lighynode version 20.1.0.2 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable sims: OK ✔ in 11.65 seconds build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests71: freeze> python -m pip freeze --all build_karaf_tests71: 
bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests121: OK ✔ in 1 minute 4.8 seconds build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests221: OK ✔ in 1 minute 6.1 seconds tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests71: OK ✔ in 58.91 seconds tests_tapi: freeze> python -m pip freeze --all build_karaf_tests_hybrid: freeze> python -m pip freeze --all tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi using environment variables from ./karaf221.env pytest -q transportpce_tests/tapi/test01_abstracted_topology.py build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED 
--add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable testsPCE: freeze> python -m pip freeze --all testsPCE: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.54.1,gnpy4tpce==2.4.7,idna==3.10,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==3.0.1,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.5.0,pbr==5.11.1,pillow==11.0.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.2.0,pytest==8.3.3,python-dateutil==2.9.0.post0,pytz==2024.2,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.3,Werkzeug==2.0.3,wheel==0.44.0,xlrd==1.2.0 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pytest -q transportpce_tests/pce/test01_pce.py ..................FF..F..F...F [100%] 20 passed in 119.94s (0:01:59) pytest -q transportpce_tests/pce/test02_pce_400G.py ..FF...FF..FF.....FF..........F [100%] 9 passed in 41.65s pytest -q transportpce_tests/pce/test03_gnpy.py ...F....FF.... [100%] 8 passed in 37.46s pytest -q transportpce_tests/pce/test04_pce_bug_fix.py .F.FF..F...... [100%] 3 passed in 35.12s build_karaf_tests_hybrid: OK ✔ in 1 minute 14.57 seconds testsPCE: OK ✔ in 5 minutes 24.6 seconds tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests121: freeze> python -m pip freeze --all tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 using environment variables from ./karaf121.env pytest -q transportpce_tests/1.2.1/test01_portmapping.py . [100%] =================================== FAILURES =================================== _____________ TransportTapitesting.test_01_get_tapi_topology_T100G _____________ self = def test_01_get_tapi_topology_T100G(self): self.tapi_topo["topology-id"] = test_utils.T100GE_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:190: AssertionError ---------------------------- Captured stdout setup ----------------------------- starting OpenDaylight... starting KARAF TransportPCE build... Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! OpenDaylight started ! installing tapi feature... installing feature odl-transportpce-tapi client: JAVA_HOME not set; results may vary odl-transportpce-tapi │ 10.0.0.SNAPSHOT │ x │ Started │ odl-transportpce-tapi │ OpenDaylight :: transportpce :: tapi Restarting OpenDaylight... 
starting KARAF TransportPCE build... Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! starting simulator xpdra in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in xpdra-221.log... Pattern found! simulator for xpdra started starting simulator roadma in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in roadma-221.log... Pattern found! simulator for roadma started starting simulator roadmb in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in roadmb-221.log... Pattern found! simulator for roadmb started starting simulator roadmc in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in roadmc-221.log... Pattern found! simulator for roadmc started starting simulator xpdrc in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in xpdrc-221.log... Pattern found! simulator for xpdrc started starting simulator spdra in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in spdra-221.log... Pattern found! simulator for spdra started starting simulator spdrc in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in spdrc-221.log... Pattern found! simulator for spdrc started ---------------------------- Captured stderr setup ----------------------------- SLF4J(W): No SLF4J providers were found. SLF4J(W): Defaulting to no-operation (NOP) logger implementation SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details. SLF4J(W): Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier. SLF4J(W): Ignoring binding found at [jar:file:/w/workspace/transportpce-tox-verify-transportpce-master/karaf221/target/assembly/system/org/apache/karaf/org.apache.karaf.client/4.4.6/org.apache.karaf.client-4.4.6.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J(W): See https://www.slf4j.org/codes.html#ignoredBindings for an explanation. 
----------------------------- Captured stdout call ----------------------------- execution of test_01_get_tapi_topology_T100G ______________ TransportTapitesting.test_02_get_tapi_topology_T0 _______________ self = def test_02_get_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:206: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_02_get_tapi_topology_T0 ________________ TransportTapitesting.test_04_check_tapi_topos _________________ self = def test_04_check_tapi_topos(self): self.tapi_topo["topology-id"] = test_utils.T100GE_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:218: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_check_tapi_topos ________________ TransportTapitesting.test_07_check_tapi_topos _________________ self = def test_07_check_tapi_topos(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:241: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_07_check_tapi_topos ________________ TransportTapitesting.test_10_check_tapi_topos _________________ self = def test_10_check_tapi_topos(self): > self.test_01_get_tapi_topology_T100G() transportpce_tests/tapi/test01_abstracted_topology.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/tapi/test01_abstracted_topology.py:190: in test_01_get_tapi_topology_T100G self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_10_check_tapi_topos ____________ TransportTapitesting.test_13_check_tapi_topology_T100G ____________ self = def test_13_check_tapi_topology_T100G(self): self.tapi_topo["topology-id"] = test_utils.T100GE_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:299: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_13_check_tapi_topology_T100G _____________ TransportTapitesting.test_14_check_tapi_topology_T0 ______________ self = def test_14_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:310: AssertionError 
----------------------------- Captured stdout call ----------------------------- execution of test_14_check_tapi_topology_T0 ____________ TransportTapitesting.test_18_check_tapi_topology_T100G ____________ self = def test_18_check_tapi_topology_T100G(self): self.tapi_topo["topology-id"] = test_utils.T100GE_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:350: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_18_check_tapi_topology_T100G _____________ TransportTapitesting.test_19_check_tapi_topology_T0 ______________ self = def test_19_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:364: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_19_check_tapi_topology_T0 ____________ TransportTapitesting.test_22_check_tapi_topology_T100G ____________ self = def test_22_check_tapi_topology_T100G(self): > self.test_18_check_tapi_topology_T100G() transportpce_tests/tapi/test01_abstracted_topology.py:387: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/tapi/test01_abstracted_topology.py:350: in test_18_check_tapi_topology_T100G self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_22_check_tapi_topology_T100G _____________ TransportTapitesting.test_23_check_tapi_topology_T0 ______________ self = def test_23_check_tapi_topology_T0(self): > self.test_19_check_tapi_topology_T0() transportpce_tests/tapi/test01_abstracted_topology.py:390: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/tapi/test01_abstracted_topology.py:364: in test_19_check_tapi_topology_T0 self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_23_check_tapi_topology_T0 ____________ TransportTapitesting.test_28_check_tapi_topology_T100G ____________ self = def test_28_check_tapi_topology_T100G(self): > self.test_18_check_tapi_topology_T100G() transportpce_tests/tapi/test01_abstracted_topology.py:433: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/tapi/test01_abstracted_topology.py:350: in test_18_check_tapi_topology_T100G self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_28_check_tapi_topology_T100G _____________ TransportTapitesting.test_29_check_tapi_topology_T0 ______________ self = def test_29_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], 
requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:439: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_29_check_tapi_topology_T0 _____________ TransportTapitesting.test_32_check_tapi_topology_T0 ______________ self = def test_32_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:494: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_32_check_tapi_topology_T0 _____________ TransportTapitesting.test_34_check_tapi_topology_T0 ______________ self = def test_34_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:533: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_34_check_tapi_topology_T0 _____________ TransportTapitesting.test_37_check_tapi_topology_T0 ______________ self = def test_37_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:578: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_37_check_tapi_topology_T0 _______________ TransportTapitesting.test_38_delete_ODU4_service _______________ self = def test_38_delete_ODU4_service(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1-ODU4" response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Renderer service delete in progress', response['output']['configuration-response-common']['response-message']) E AssertionError: 'Renderer service delete in progress' not found in "Service 'service1-ODU4' does not exist in datastore" transportpce_tests/tapi/test01_abstracted_topology.py:598: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_38_delete_ODU4_service _____________ TransportTapitesting.test_40_check_tapi_topology_T0 ______________ self = def test_40_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:616: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_40_check_tapi_topology_T0 _____________ TransportTapitesting.test_42_check_tapi_topology_T0 
______________ self = def test_42_check_tapi_topology_T0(self): self.tapi_topo["topology-id"] = test_utils.T0_MULTILAYER_TOPO_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:638: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_42_check_tapi_topology_T0 _____________ TransportTapitesting.test_43_get_tapi_topology_T100G _____________ self = def test_43_get_tapi_topology_T100G(self): self.tapi_topo["topology-id"] = test_utils.T100GE_UUID response = test_utils.transportpce_api_rpc_request( 'tapi-topology', 'get-topology-details', self.tapi_topo) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 transportpce_tests/tapi/test01_abstracted_topology.py:652: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_43_get_tapi_topology_T100G ________________ TransportTapitesting.test_46_check_tapi_topos _________________ self = def test_46_check_tapi_topos(self): > self.test_01_get_tapi_topology_T100G() transportpce_tests/tapi/test01_abstracted_topology.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/tapi/test01_abstracted_topology.py:190: in test_01_get_tapi_topology_T100G self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 500 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_46_check_tapi_topos =========================== short test summary info ============================ FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_01_get_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_02_get_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_04_check_tapi_topos FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_07_check_tapi_topos FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_10_check_tapi_topos FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_13_check_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_14_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_18_check_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_19_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_22_check_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_23_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_28_check_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_29_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_32_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_34_check_tapi_topology_T0 FAILED 
transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_37_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_38_delete_ODU4_service FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_40_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_42_check_tapi_topology_T0 FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_43_get_tapi_topology_T100G FAILED transportpce_tests/tapi/test01_abstracted_topology.py::TransportTapitesting::test_46_check_tapi_topos 21 failed, 29 passed in 394.43s (0:06:34) tests_tapi: exit 1 (394.68 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi pid=30757 tests_tapi: FAIL ✖ in 6 minutes 53.56 seconds tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests71: freeze> python -m pip freeze --all tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1 using environment variables from ./karaf71.env pytest -q transportpce_tests/7.1/test01_portmapping.py ............ [100%] 12 passed in 43.03s pytest -q transportpce_tests/7.1/test02_otn_renderer.py .FFFFF.............................................................. [100%] 62 passed in 164.50s (0:02:44) pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py ....FF.FFFF.FF........................................... [100%] 48 passed in 134.79s (0:02:14) pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py ...................... 
[100%] 22 passed in 75.91s (0:01:15) tests71: OK ✔ in 7 minutes 5.5 seconds tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests221: freeze> python -m pip freeze --all tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 using environment variables from ./karaf221.env pytest -q transportpce_tests/2.2.1/test01_portmapping.py FFFFFF [100%] =================================== FAILURES =================================== _________ TransportPCEPortMappingTesting.test_02_rdm_device_connected __________ self = def test_02_rdm_device_connected(self): response = test_utils.check_device_connection("ROADMA01") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:54: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_02_rdm_device_connected _________ TransportPCEPortMappingTesting.test_03_rdm_portmapping_info __________ self = def test_03_rdm_portmapping_info(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:60: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_03_rdm_portmapping_info _____ TransportPCEPortMappingTesting.test_04_rdm_portmapping_DEG1_TTP_TXRX _____ self = def test_04_rdm_portmapping_DEG1_TTP_TXRX(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "DEG1-TTP-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:73: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_rdm_portmapping_DEG1_TTP_TXRX _____ TransportPCEPortMappingTesting.test_05_rdm_portmapping_SRG1_PP7_TXRX _____ self = def test_05_rdm_portmapping_SRG1_PP7_TXRX(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "SRG1-PP7-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:82: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_05_rdm_portmapping_SRG1_PP7_TXRX _____ TransportPCEPortMappingTesting.test_06_rdm_portmapping_SRG3_PP1_TXRX _____ self = def test_06_rdm_portmapping_SRG3_PP1_TXRX(self): response = test_utils.get_portmapping_node_attr("ROADMA01", "mapping", "SRG3-PP1-TXRX") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:91: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_06_rdm_portmapping_SRG3_PP1_TXRX _________ 
TransportPCEPortMappingTesting.test_08_xpdr_device_connected _________ self = def test_08_xpdr_device_connected(self): response = test_utils.check_device_connection("XPDRA01") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:104: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_08_xpdr_device_connected _________ TransportPCEPortMappingTesting.test_09_xpdr_portmapping_info _________ self = def test_09_xpdr_portmapping_info(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:110: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_09_xpdr_portmapping_info _______ TransportPCEPortMappingTesting.test_10_xpdr_portmapping_NETWORK1 _______ self = def test_10_xpdr_portmapping_NETWORK1(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK1") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:123: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_10_xpdr_portmapping_NETWORK1 _______ TransportPCEPortMappingTesting.test_11_xpdr_portmapping_NETWORK2 _______ self = def test_11_xpdr_portmapping_NETWORK2(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK2") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:134: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_11_xpdr_portmapping_NETWORK2 _______ TransportPCEPortMappingTesting.test_12_xpdr_portmapping_CLIENT1 ________ self = def test_12_xpdr_portmapping_CLIENT1(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT1") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:145: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_12_xpdr_portmapping_CLIENT1 _______ TransportPCEPortMappingTesting.test_13_xpdr_portmapping_CLIENT2 ________ self = def test_13_xpdr_portmapping_CLIENT2(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT2") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:157: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_13_xpdr_portmapping_CLIENT2 _______ TransportPCEPortMappingTesting.test_14_xpdr_portmapping_CLIENT3 ________ self = def test_14_xpdr_portmapping_CLIENT3(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT3") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:169: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_14_xpdr_portmapping_CLIENT3 _______ TransportPCEPortMappingTesting.test_15_xpdr_portmapping_CLIENT4 
________ self = def test_15_xpdr_portmapping_CLIENT4(self): response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT4") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/1.2.1/test01_portmapping.py:181: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_15_xpdr_portmapping_CLIENT4 _______ TransportPCEPortMappingTesting.test_16_xpdr_device_disconnection _______ self = def test_16_xpdr_device_disconnection(self): response = test_utils.unmount_device("XPDRA01") > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) E AssertionError: 409 not found in (200, 204) transportpce_tests/1.2.1/test01_portmapping.py:192: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_16_xpdr_device_disconnection Searching for pattern 'onDeviceDisConnected:\ XPDRA01' in karaf.log... Pattern not found after 180 seconds! Node XPDRA01 still not deleted from tpce topology... _______ TransportPCEPortMappingTesting.test_17_xpdr_device_disconnected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query='content=nonconfig', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? 
if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_17_xpdr_device_disconnected(self): > response = test_utils.check_device_connection("XPDRA01") transportpce_tests/1.2.1/test01_portmapping.py:195: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:369: in check_device_connection response = get_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_17_xpdr_device_disconnected _______ TransportPCEPortMappingTesting.test_18_xpdr_device_not_connected _______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
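
The pool-level urlopen() shown above hands the refused connection to the retry logic. A hypothetical standalone call with the same Retry configuration as in the log (total=0) reproduces the MaxRetryError without going through requests:

import urllib3
from urllib3.exceptions import MaxRetryError
from urllib3.util.retry import Retry

# Retry(total=0, ...) means the first connection error already exhausts the
# retry budget, so urlopen() raises MaxRetryError instead of retrying.
pool = urllib3.HTTPConnectionPool(
    "localhost", 8182,
    retries=Retry(total=0, connect=None, read=False, redirect=None, status=None),
)
try:
    pool.urlopen("GET", "/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info")
except MaxRetryError as exc:
    print(exc.reason)  # NewConnectionError(...: Connection refused)
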
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_18_xpdr_device_not_connected(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None) transportpce_tests/1.2.1/test01_portmapping.py:203: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
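
The Retry.increment() code earlier in this traceback explains why there is no second attempt: with total=0, the first connection error drives the counter below zero and is_exhausted() turns the NewConnectionError into a MaxRetryError. A small illustrative sketch (values copied from the log, the call itself is hypothetical):

from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
# The first argument of NewConnectionError is normally the connection object;
# None is used here only as a placeholder for the sketch.
err = NewConnectionError(None, "Failed to establish a new connection: [Errno 111] Connection refused")
try:
    # total goes 0 -> -1, new_retry.is_exhausted() is True, so MaxRetryError
    # is raised from the underlying connection error.
    retry.increment(method="GET",
                    url="/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info",
                    error=err)
except MaxRetryError as exc:
    print(exc)
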
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_18_xpdr_device_not_connected _______ TransportPCEPortMappingTesting.test_19_rdm_device_disconnection ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
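
Before the urlopen() call shown at the top of this excerpt, HTTPAdapter.send() normalises the timeout argument into the urllib3 Timeout object that appears throughout the log as Timeout(connect=10, read=10, total=None). A short sketch of that conversion, assuming the tests pass a 10 second timeout:

from urllib3.util import Timeout

# A single float sets both phases; a (connect, read) tuple sets them separately.
t_float = Timeout(connect=10, read=10)      # from timeout=10
connect, read = (10, 10)                    # from timeout=(10, 10)
t_tuple = Timeout(connect=connect, read=read)
print(t_float)  # Timeout(connect=10, read=10, total=None)
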
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_19_rdm_device_disconnection(self): > response = test_utils.unmount_device("ROADMA01") transportpce_tests/1.2.1/test01_portmapping.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:358: in unmount_device response = delete_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_19_rdm_device_disconnection ________ TransportPCEPortMappingTesting.test_20_rdm_device_disconnected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
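
In the adapter.send() excerpt above, the MaxRetryError whose reason is a NewConnectionError falls through to the final branch and is re-raised as requests.exceptions.ConnectionError, which is what each failing test ultimately sees. The whole chain can be reproduced end to end with plain requests while the controller on localhost:8182 is down (illustrative sketch, not the project's test_utils helper):

import requests

try:
    requests.get(
        "http://localhost:8182/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info",
        auth=("admin", "admin"),
        headers={"Accept": "application/json", "Content-Type": "application/json"},
        timeout=10,
    )
except requests.exceptions.ConnectionError as exc:
    print(exc)  # Max retries exceeded ... Connection refused
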
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query='content=nonconfig', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_20_rdm_device_disconnected(self): > response = test_utils.check_device_connection("ROADMA01") transportpce_tests/1.2.1/test01_portmapping.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:369: in check_device_connection response = get_request(url[RESTCONF_VERSION].format('{}', node)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_20_rdm_device_disconnected _______ TransportPCEPortMappingTesting.test_21_rdm_device_not_connected ________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8182), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
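
For reference, the failing tests in this portmapping suite all target the same RESTCONF endpoint on localhost:8182, with the URLs taken from the tracebacks above. A hypothetical helper summarising those three operations (the real calls live in transportpce_tests/common/test_utils.py):

import requests

RESTCONF = "http://localhost:8182/rests/data"
AUTH = ("admin", "admin")

def unmount(node):
    # test_19: remove the NETCONF mount point for the device
    return requests.delete(
        f"{RESTCONF}/network-topology:network-topology/topology=topology-netconf/node={node}",
        auth=AUTH, timeout=10)

def connection_status(node):
    # test_20: read the operational (nonconfig) state of the mount point
    return requests.get(
        f"{RESTCONF}/network-topology:network-topology/topology=topology-netconf/node={node}?content=nonconfig",
        auth=AUTH, timeout=10)

def portmapping_node_info(node):
    # test_18 / test_21: read the TransportPCE portmapping node-info
    return requests.get(
        f"{RESTCONF}/transportpce-portmapping:network/nodes={node}/node-info",
        auth=AUTH, timeout=10)
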
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:441: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:279: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:214: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
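The stack above shows the requests adapter delegating to urllib3 with Retry(total=0, connect=None, read=False, ...), so the very first connection refusal to localhost:8182 is fatal. A minimal sketch, assuming only the public requests/urllib3 APIs visible in this traceback, of mounting an HTTPAdapter with a small retry budget and backoff; the session, retry values and probed URL are illustrative, not part of the test suite:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Illustrative values: retry connection errors and 5xx responses a few times
# with exponential backoff (total=0, as in the log above, disables retries).
retry_policy = Retry(total=3, connect=3, read=3, backoff_factor=0.5,
                     status_forcelist=[500, 502, 503, 504])

session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=retry_policy))

# Hypothetical probe of the RESTCONF URL seen in the failures above;
# admin/admin matches the Basic auth header shown in the traceback.
response = session.get(
    "http://localhost:8182/rests/data/network-topology:network-topology"
    "/topology=topology-netconf/node=ROADMA01?content=nonconfig",
    auth=("admin", "admin"), timeout=(10, 10))
print(response.status_code)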
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_21_rdm_device_not_connected(self): > response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None) transportpce_tests/1.2.1/test01_portmapping.py:223: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:470: in get_portmapping_node_attr response = get_request(target_url) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8182): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ----------------------------- Captured stdout call ----------------------------- execution of test_21_rdm_device_not_connected --------------------------- Captured stdout teardown --------------------------- all processes killed =========================== short test summary info ============================ FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_02_rdm_device_connected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_03_rdm_portmapping_info FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_04_rdm_portmapping_DEG1_TTP_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_05_rdm_portmapping_SRG1_PP7_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_06_rdm_portmapping_SRG3_PP1_TXRX FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_08_xpdr_device_connected FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_09_xpdr_portmapping_info FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_10_xpdr_portmapping_NETWORK1 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_11_xpdr_portmapping_NETWORK2 FAILED 
transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_12_xpdr_portmapping_CLIENT1
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_13_xpdr_portmapping_CLIENT2
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_14_xpdr_portmapping_CLIENT3
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_15_xpdr_portmapping_CLIENT4
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_16_xpdr_device_disconnection
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_17_xpdr_device_disconnected
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_18_xpdr_device_not_connected
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_19_rdm_device_disconnection
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_20_rdm_device_disconnected
FAILED transportpce_tests/1.2.1/test01_portmapping.py::TransportPCEPortMappingTesting::test_21_rdm_device_not_connected
19 failed, 2 passed in 595.66s (0:09:55)
tests121: exit 1 (596.01 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 pid=35410
................................... [100%] 35 passed in 80.27s (0:01:20)
pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py ...... [100%] 6 passed in 46.73s
pytest -q transportpce_tests/2.2.1/test03_topology.py ............................................ [100%] 44 passed in 319.95s (0:05:19)
pytest -q transportpce_tests/2.2.1/test04_otn_topology.py ............ [100%] 12 passed in 60.35s (0:01:00)
pytest -q transportpce_tests/2.2.1/test05_flex_grid.py ................ [100%] 16 passed in 115.76s (0:01:55)
pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py ............................... [100%] 31 passed in 37.46s
pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py .......................... [100%] 26 passed in 92.30s (0:01:32)
pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py ...................... [100%] 22 passed in 101.00s (0:01:40)
pytest -q transportpce_tests/2.2.1/test09_olm.py ........................................ [100%] 40 passed in 363.32s (0:06:03)
pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py .......................FFFFF.FFFFFFFFFFF.F....FF....................F.FF [ 74%] FFFFFFFFF.F....FF........
[100%] =================================== FAILURES =================================== ________________ TransportPCEtesting.test_24_get_ODU4_service1 _________________ self = def test_24_get_ODU4_service1(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service1-ODU4") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:444: AssertionError ____________ TransportPCEtesting.test_25_check_interface_ODU4_spdra ____________ self = def test_25_check_interface_ODU4_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR1-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:457: AssertionError ____________ TransportPCEtesting.test_26_check_interface_ODU4_spdrc ____________ self = def test_26_check_interface_ODU4_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR1-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:488: AssertionError _______________ TransportPCEtesting.test_27_check_otn_topo_links _______________ self = def test_27_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/2.2.1/test11_otn_end2end.py:519: AssertionError ________________ TransportPCEtesting.test_28_check_otn_topo_tp _________________ self = def test_28_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('SPDR-SA1-XPDR1', 'SPDR-SC1-XPDR1'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR1-NETWORK1': xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] > self.assertEqual(len(xpdrTpPortConAt['ts-pool']), 80) E KeyError: 'ts-pool' transportpce_tests/2.2.1/test11_otn_end2end.py:553: KeyError ________________ TransportPCEtesting.test_30_get_10GE_service1 _________________ self = def test_30_get_10GE_service1(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service1-10GE") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:584: AssertionError ________ TransportPCEtesting.test_31_check_interface_10GE_CLIENT_spdra _________ self = def test_31_check_interface_10GE_CLIENT_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR1-CLIENT1-ETHERNET10G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:597: AssertionError ________ TransportPCEtesting.test_32_check_interface_ODU2E_CLIENT_spdra ________ self = def test_32_check_interface_ODU2E_CLIENT_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR1-CLIENT1-ODU2e:service1-10GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 
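Most of the failures above follow the same pattern: a test_utils helper returns a dict whose 'status_code' is 409 where 200 was expected. A minimal sketch of a hypothetical assertion helper (not part of test_utils) that puts the whole reply into the failure message, assuming only the dict shape shown in these tracebacks:

import requests

def assert_status_ok(testcase, response, context=""):
    # Hypothetical helper: fail with the full reply, not just "409 != 200".
    testcase.assertEqual(
        response['status_code'], requests.codes.ok,
        msg=f"{context}: unexpected status {response['status_code']}, reply={response}")

# Usage inside a test method (helper names as in the log, call illustrative):
#   response = test_utils.check_node_attribute_request(
#       'SPDR-SA1', 'interface', 'XPDR1-NETWORK1-ODU4')
#   assert_status_ok(self, response, context='ODU4 interface on SPDR-SA1')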
transportpce_tests/2.2.1/test11_otn_end2end.py:613: AssertionError _______ TransportPCEtesting.test_33_check_interface_ODU2E_NETWORK_spdra ________ self = def test_33_check_interface_ODU2E_NETWORK_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR1-NETWORK1-ODU2e:service1-10GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:637: AssertionError ___________ TransportPCEtesting.test_34_check_ODU2E_connection_spdra ___________ self = def test_34_check_ODU2E_connection_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'odu-connection', 'XPDR1-CLIENT1-ODU2e-x-XPDR1-NETWORK1-ODU2e') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:665: AssertionError ________ TransportPCEtesting.test_35_check_interface_10GE_CLIENT_spdrc _________ self = def test_35_check_interface_10GE_CLIENT_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR1-CLIENT1-ETHERNET10G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:682: AssertionError ________ TransportPCEtesting.test_36_check_interface_ODU2E_CLIENT_spdrc ________ self = def test_36_check_interface_ODU2E_CLIENT_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR1-CLIENT1-ODU2e:service1-10GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:698: AssertionError _______ TransportPCEtesting.test_37_check_interface_ODU2E_NETWORK_spdrc ________ self = def test_37_check_interface_ODU2E_NETWORK_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR1-NETWORK1-ODU2e:service1-10GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:722: AssertionError ___________ TransportPCEtesting.test_38_check_ODU2E_connection_spdrc ___________ self = def test_38_check_ODU2E_connection_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'odu-connection', 'XPDR1-CLIENT1-ODU2e-x-XPDR1-NETWORK1-ODU2e') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:754: AssertionError _______________ TransportPCEtesting.test_39_check_otn_topo_links _______________ self = def test_39_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/2.2.1/test11_otn_end2end.py:771: AssertionError ________________ TransportPCEtesting.test_40_check_otn_topo_tp _________________ self = def test_40_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('SPDR-SA1-XPDR1', 'SPDR-SC1-XPDR1'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR1-NETWORK1': xpdrTpPortConAt = 
tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] > self.assertEqual(len(xpdrTpPortConAt['ts-pool']), 72) E KeyError: 'ts-pool' transportpce_tests/2.2.1/test11_otn_end2end.py:790: KeyError ________________ TransportPCEtesting.test_42_check_service_list ________________ self = def test_42_check_service_list(self): response = test_utils.get_ordm_serv_list_request() self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['service-list']['services']), 2) E AssertionError: 1 != 2 transportpce_tests/2.2.1/test11_otn_end2end.py:810: AssertionError _______________ TransportPCEtesting.test_47_check_otn_topo_links _______________ self = def test_47_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/2.2.1/test11_otn_end2end.py:835: AssertionError ________________ TransportPCEtesting.test_48_check_otn_topo_tp _________________ self = def test_48_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('SPDR-SA1-XPDR1', 'SPDR-SC1-XPDR1'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR1-NETWORK1': xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] > self.assertEqual(len(xpdrTpPortConAt['ts-pool']), 80) E KeyError: 'ts-pool' transportpce_tests/2.2.1/test11_otn_end2end.py:854: KeyError ________________ TransportPCEtesting.test_69_get_ODU4_service2 _________________ self = def test_69_get_ODU4_service2(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service2-ODU4") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1055: AssertionError _________________ TransportPCEtesting.test_71_get_1GE_service1 _________________ self = def test_71_get_1GE_service1(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service1-1GE") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1092: AssertionError _________ TransportPCEtesting.test_72_check_interface_1GE_CLIENT_spdra _________ self = def test_72_check_interface_1GE_CLIENT_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR3-CLIENT1-ETHERNET1G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1105: AssertionError ________ TransportPCEtesting.test_73_check_interface_ODU0_CLIENT_spdra _________ self = def test_73_check_interface_ODU0_CLIENT_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 'XPDR3-CLIENT1-ODU0:service1-1GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1121: AssertionError ________ TransportPCEtesting.test_74_check_interface_ODU0_NETWORK_spdra ________ self = def test_74_check_interface_ODU0_NETWORK_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'interface', 
'XPDR3-NETWORK1-ODU0:service1-1GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1144: AssertionError ___________ TransportPCEtesting.test_75_check_ODU0_connection_spdra ____________ self = def test_75_check_ODU0_connection_spdra(self): response = test_utils.check_node_attribute_request( 'SPDR-SA1', 'odu-connection', 'XPDR3-CLIENT1-ODU0-x-XPDR3-NETWORK1-ODU0') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1171: AssertionError _________ TransportPCEtesting.test_76_check_interface_1GE_CLIENT_spdrc _________ self = def test_76_check_interface_1GE_CLIENT_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR3-CLIENT1-ETHERNET1G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1187: AssertionError ________ TransportPCEtesting.test_77_check_interface_ODU0_CLIENT_spdrc _________ self = def test_77_check_interface_ODU0_CLIENT_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR3-CLIENT1-ODU0:service1-1GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1203: AssertionError ________ TransportPCEtesting.test_78_check_interface_ODU0_NETWORK_spdrc ________ self = def test_78_check_interface_ODU0_NETWORK_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'interface', 'XPDR3-NETWORK1-ODU0:service1-1GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1226: AssertionError ___________ TransportPCEtesting.test_79_check_ODU0_connection_spdrc ____________ self = def test_79_check_ODU0_connection_spdrc(self): response = test_utils.check_node_attribute_request( 'SPDR-SC1', 'odu-connection', 'XPDR3-CLIENT1-ODU0-x-XPDR3-NETWORK1-ODU0') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test11_otn_end2end.py:1256: AssertionError _______________ TransportPCEtesting.test_80_check_otn_topo_links _______________ self = def test_80_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/2.2.1/test11_otn_end2end.py:1272: AssertionError ________________ TransportPCEtesting.test_81_check_otn_topo_tp _________________ self = def test_81_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('SPDR-SA1-XPDR3', 'SPDR-SC1-XPDR3'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR3-NETWORK1': xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] > self.assertEqual(len(xpdrTpPortConAt['ts-pool']), 79) E KeyError: 'ts-pool' transportpce_tests/2.2.1/test11_otn_end2end.py:1291: KeyError ________________ TransportPCEtesting.test_83_check_service_list ________________ self = def test_83_check_service_list(self): response = 
test_utils.get_ordm_serv_list_request() self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['service-list']['services']), 2) E AssertionError: 1 != 2 transportpce_tests/2.2.1/test11_otn_end2end.py:1311: AssertionError _______________ TransportPCEtesting.test_88_check_otn_topo_links _______________ self = def test_88_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/2.2.1/test11_otn_end2end.py:1336: AssertionError ________________ TransportPCEtesting.test_89_check_otn_topo_tp _________________ self = def test_89_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('SPDR-SA1-XPDR3', 'SPDR-SC1-XPDR3'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR3-NETWORK1': xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] > self.assertEqual(len(xpdrTpPortConAt['ts-pool']), 80) E KeyError: 'ts-pool' transportpce_tests/2.2.1/test11_otn_end2end.py:1355: KeyError =========================== short test summary info ============================ FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_24_get_ODU4_service1 FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_25_check_interface_ODU4_spdra FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_26_check_interface_ODU4_spdrc FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_27_check_otn_topo_links FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_28_check_otn_topo_tp FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_30_get_10GE_service1 FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_31_check_interface_10GE_CLIENT_spdra FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_32_check_interface_ODU2E_CLIENT_spdra FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_33_check_interface_ODU2E_NETWORK_spdra FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_34_check_ODU2E_connection_spdra FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_35_check_interface_10GE_CLIENT_spdrc FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_36_check_interface_ODU2E_CLIENT_spdrc FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_37_check_interface_ODU2E_NETWORK_spdrc FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_38_check_ODU2E_connection_spdrc FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_39_check_otn_topo_links FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_40_check_otn_topo_tp FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_42_check_service_list FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_47_check_otn_topo_links FAILED 
transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_48_check_otn_topo_tp
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_69_get_ODU4_service2
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_71_get_1GE_service1
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_72_check_interface_1GE_CLIENT_spdra
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_73_check_interface_ODU0_CLIENT_spdra
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_74_check_interface_ODU0_NETWORK_spdra
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_75_check_ODU0_connection_spdra
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_76_check_interface_1GE_CLIENT_spdrc
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_77_check_interface_ODU0_CLIENT_spdrc
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_78_check_interface_ODU0_NETWORK_spdrc
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_79_check_ODU0_connection_spdrc
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_80_check_otn_topo_links
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_81_check_otn_topo_tp
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_83_check_service_list
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_88_check_otn_topo_links
FAILED transportpce_tests/2.2.1/test11_otn_end2end.py::TransportPCEtesting::test_89_check_otn_topo_tp
34 failed, 63 passed in 492.49s (0:08:12)
tests121: FAIL ✖ in 10 minutes 3.04 seconds
tests221: exit 1 (1712.00 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 pid=39186
tests221: FAIL ✖ in 28 minutes 38.98 seconds
tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
tests_hybrid: freeze> python -m pip freeze --all
tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0
tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid
using environment variables from ./karaf121.env
pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py ................................................... [100%] 51 passed in 511.79s (0:08:31)
pytest -q transportpce_tests/hybrid/test02_B100G_end2end.py .......................FFFFF.FFFFFFFFFFF.FFFFFFF....FF.FFFFFFFFFFFFFF... [ 66%] .....................................
[100%] =================================== FAILURES =================================== _______________ TransportPCEtesting.test_024_get_ODUC4_service1 ________________ self = def test_024_get_ODUC4_service1(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service1-ODUC4") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:456: AssertionError __________ TransportPCEtesting.test_025_check_interface_ODUC4_xpdra2 ___________ self = def test_025_check_interface_ODUC4_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-NETWORK1-ODUC4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:471: AssertionError __________ TransportPCEtesting.test_026_check_interface_ODUC4_xpdrc2 ___________ self = def test_026_check_interface_ODUC4_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-NETWORK1-ODUC4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:497: AssertionError ______________ TransportPCEtesting.test_027_check_otn_topo_links _______________ self = def test_027_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:523: AssertionError ________________ TransportPCEtesting.test_028_check_otn_topo_tp ________________ self = def test_028_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR2-NETWORK1': > xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] E KeyError: 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' transportpce_tests/hybrid/test02_B100G_end2end.py:556: KeyError _______________ TransportPCEtesting.test_030_get_100GE_service_1 _______________ self = def test_030_get_100GE_service_1(self): response = test_utils.get_ordm_serv_list_attr_request( "services", "service-100GE") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:588: AssertionError _______ TransportPCEtesting.test_031_check_interface_100GE_CLIENT_xpdra2 _______ self = def test_031_check_interface_100GE_CLIENT_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-CLIENT1-ETHERNET-100G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:597: AssertionError _______ TransportPCEtesting.test_032_check_interface_ODU4_CLIENT_xpdra2 ________ self = def test_032_check_interface_ODU4_CLIENT_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-CLIENT1-ODU4:service-100GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 
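test_027_check_otn_topo_links above expects four ietf-network-topology:link entries but finds two. A minimal sketch, assuming the get_ietf_network_request helper and payload shape shown above, and assuming (unverified) that the mismatch could be a timing effect rather than genuinely missing links, of a hypothetical polling helper that waits for the expected link count before asserting:

import time
# Import path is an assumption; the helper module lives at
# transportpce_tests/common/test_utils.py in this repository.
from transportpce_tests.common import test_utils

def wait_for_otn_links(expected, attempts=10, delay=5):
    # Hypothetical helper: poll otn-topology until `expected` links appear.
    links = []
    for _ in range(attempts):
        response = test_utils.get_ietf_network_request('otn-topology', 'config')
        links = response['network'][0].get('ietf-network-topology:link', [])
        if response['status_code'] == 200 and len(links) >= expected:
            return links
        time.sleep(delay)
    raise AssertionError(
        f"only {len(links)} otn-topology links after {attempts} attempts, expected {expected}")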
transportpce_tests/hybrid/test02_B100G_end2end.py:614: AssertionError _______ TransportPCEtesting.test_033_check_interface_ODU4_NETWORK_xpdra2 _______ self = def test_033_check_interface_ODU4_NETWORK_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-NETWORK1-ODU4:service-100GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:638: AssertionError __________ TransportPCEtesting.test_034_check_ODU4_connection_xpdra2 ___________ self = def test_034_check_ODU4_connection_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'odu-connection', 'XPDR2-CLIENT1-ODU4-x-XPDR2-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:668: AssertionError _______ TransportPCEtesting.test_035_check_interface_100GE_CLIENT_xpdrc2 _______ self = def test_035_check_interface_100GE_CLIENT_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-CLIENT1-ETHERNET-100G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:685: AssertionError _______ TransportPCEtesting.test_036_check_interface_ODU4_CLIENT_xpdrc2 ________ self = def test_036_check_interface_ODU4_CLIENT_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-CLIENT1-ODU4:service-100GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:702: AssertionError _______ TransportPCEtesting.test_037_check_interface_ODU4_NETWORK_xpdrc2 _______ self = def test_037_check_interface_ODU4_NETWORK_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-NETWORK1-ODU4:service-100GE') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:726: AssertionError __________ TransportPCEtesting.test_038_check_ODU4_connection_xpdrc2 ___________ self = def test_038_check_ODU4_connection_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'odu-connection', 'XPDR2-CLIENT1-ODU4-x-XPDR2-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:762: AssertionError ______________ TransportPCEtesting.test_039_check_otn_topo_links _______________ self = def test_039_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:779: AssertionError ________________ TransportPCEtesting.test_040_check_otn_topo_tp ________________ self = def test_040_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR2-NETWORK1': > xpdrTpPortConAt = 
tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] E KeyError: 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' transportpce_tests/hybrid/test02_B100G_end2end.py:797: KeyError _______________ TransportPCEtesting.test_042_get_100GE_service_2 _______________ self = def test_042_get_100GE_service_2(self): response = test_utils.get_ordm_serv_list_attr_request("services", "service-100GE2") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:829: AssertionError _______________ TransportPCEtesting.test_043_check_service_list ________________ self = def test_043_check_service_list(self): response = test_utils.get_ordm_serv_list_request() self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['service-list']['services']), 4) E AssertionError: 1 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:839: AssertionError ______________ TransportPCEtesting.test_044_check_otn_topo_links _______________ self = def test_044_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:844: AssertionError ________________ TransportPCEtesting.test_045_check_otn_topo_tp ________________ self = def test_045_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR2-NETWORK1': > xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] E KeyError: 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' transportpce_tests/hybrid/test02_B100G_end2end.py:862: KeyError _____________ TransportPCEtesting.test_046_delete_100GE_service_2 ______________ self = def test_046_delete_100GE_service_2(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service-100GE2" response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Renderer service delete in progress', response['output']['configuration-response-common']['response-message']) E AssertionError: 'Renderer service delete in progress' not found in "Service 'service-100GE2' does not exist in datastore" transportpce_tests/hybrid/test02_B100G_end2end.py:878: AssertionError _____________ TransportPCEtesting.test_047_delete_100GE_service_1 ______________ self = def test_047_delete_100GE_service_1(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service-100GE" response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Renderer service delete in progress', response['output']['configuration-response-common']['response-message']) E AssertionError: 'Renderer service delete in progress' not found in "Service 'service-100GE' does not exist in datastore" 
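test_046 and test_047 above fail because service-delete answers "does not exist in datastore" instead of 'Renderer service delete in progress'. A minimal sketch, reusing only the test_utils calls and RPC payload keys visible in this log, of a hypothetical guard that checks the service list before issuing the delete:

# Import path is an assumption, as noted earlier.
from transportpce_tests.common import test_utils

def delete_service_if_present(name, del_serv_input_data):
    # Hypothetical guard: only call service-delete when the service exists.
    listing = test_utils.get_ordm_serv_list_attr_request("services", name)
    if listing['status_code'] != 200:
        return f"Service '{name}' does not exist in datastore, nothing to delete"
    del_serv_input_data["service-delete-req-info"]["service-name"] = name
    response = test_utils.transportpce_api_rpc_request(
        'org-openroadm-service', 'service-delete', del_serv_input_data)
    return response['output']['configuration-response-common']['response-message']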
transportpce_tests/hybrid/test02_B100G_end2end.py:888: AssertionError _______________ TransportPCEtesting.test_048_check_service_list ________________ self = def test_048_check_service_list(self): response = test_utils.get_ordm_serv_list_request() self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['service-list']['services']), 2) E AssertionError: 1 != 2 transportpce_tests/hybrid/test02_B100G_end2end.py:895: AssertionError ______________ TransportPCEtesting.test_053_check_otn_topo_links _______________ self = def test_053_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:920: AssertionError ________________ TransportPCEtesting.test_054_check_otn_topo_tp ________________ self = def test_054_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR2-NETWORK1': > xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] E KeyError: 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' transportpce_tests/hybrid/test02_B100G_end2end.py:938: KeyError _______________ TransportPCEtesting.test_056_get_100GE_service_3 _______________ self = def test_056_get_100GE_service_3(self): response = test_utils.get_ordm_serv_list_attr_request("services", "service-100GE3") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:965: AssertionError _______________ TransportPCEtesting.test_057_check_service_list ________________ self = def test_057_check_service_list(self): response = test_utils.get_ordm_serv_list_request() self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['service-list']['services']), 3) E AssertionError: 1 != 3 transportpce_tests/hybrid/test02_B100G_end2end.py:975: AssertionError _______ TransportPCEtesting.test_058_check_interface_100GE_CLIENT_xpdra2 _______ self = def test_058_check_interface_100GE_CLIENT_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-CLIENT1-ETHERNET-100G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:980: AssertionError _______ TransportPCEtesting.test_059_check_interface_ODU4_CLIENT_xpdra2 ________ self = def test_059_check_interface_ODU4_CLIENT_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-CLIENT1-ODU4:service-100GE3') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:997: AssertionError _______ TransportPCEtesting.test_060_check_interface_ODU4_NETWORK_xpdra2 _______ self = def test_060_check_interface_ODU4_NETWORK_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'interface', 'XPDR2-NETWORK1-ODU4:service-100GE3') > self.assertEqual(response['status_code'], requests.codes.ok) E 
AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1021: AssertionError __________ TransportPCEtesting.test_061_check_ODU4_connection_xpdra2 ___________ self = def test_061_check_ODU4_connection_xpdra2(self): response = test_utils.check_node_attribute_request( 'XPDR-A2', 'odu-connection', 'XPDR2-CLIENT1-ODU4-x-XPDR2-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1051: AssertionError _______ TransportPCEtesting.test_062_check_interface_100GE_CLIENT_xpdrc2 _______ self = def test_062_check_interface_100GE_CLIENT_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-CLIENT2-ETHERNET-100G') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1068: AssertionError _______ TransportPCEtesting.test_063_check_interface_ODU4_CLIENT_xpdrc2 ________ self = def test_063_check_interface_ODU4_CLIENT_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-CLIENT2-ODU4:service-100GE3') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1085: AssertionError _______ TransportPCEtesting.test_064_check_interface_ODU4_NETWORK_xpdrc2 _______ self = def test_064_check_interface_ODU4_NETWORK_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'interface', 'XPDR2-NETWORK1-ODU4:service-100GE3') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1109: AssertionError __________ TransportPCEtesting.test_065_check_ODU4_connection_xpdrc2 ___________ self = def test_065_check_ODU4_connection_xpdrc2(self): response = test_utils.check_node_attribute_request( 'XPDR-C2', 'odu-connection', 'XPDR2-CLIENT2-ODU4-x-XPDR2-NETWORK1-ODU4') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/hybrid/test02_B100G_end2end.py:1145: AssertionError ______________ TransportPCEtesting.test_066_check_otn_topo_links _______________ self = def test_066_check_otn_topo_links(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) > self.assertEqual(len(response['network'][0]['ietf-network-topology:link']), 4) E AssertionError: 2 != 4 transportpce_tests/hybrid/test02_B100G_end2end.py:1162: AssertionError ________________ TransportPCEtesting.test_067_check_otn_topo_tp ________________ self = def test_067_check_otn_topo_tp(self): response = test_utils.get_ietf_network_request('otn-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) for node in response['network'][0]['node']: if node['node-id'] in ('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2'): tpList = node['ietf-network-topology:termination-point'] for tp in tpList: if tp['tp-id'] == 'XPDR2-NETWORK1': > xpdrTpPortConAt = tp['org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'] E KeyError: 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' transportpce_tests/hybrid/test02_B100G_end2end.py:1180: KeyError _____________ TransportPCEtesting.test_068_delete_100GE_service_3 ______________ self = def test_068_delete_100GE_service_3(self): 
self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service-100GE3" response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Renderer service delete in progress', response['output']['configuration-response-common']['response-message']) E AssertionError: 'Renderer service delete in progress' not found in "Service 'service-100GE3' does not exist in datastore" transportpce_tests/hybrid/test02_B100G_end2end.py:1193: AssertionError ______________ TransportPCEtesting.test_069_delete_ODUC4_service _______________ self = def test_069_delete_ODUC4_service(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1-ODUC4" response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) self.assertEqual(response['status_code'], requests.codes.ok) > self.assertIn('Renderer service delete in progress', response['output']['configuration-response-common']['response-message']) E AssertionError: 'Renderer service delete in progress' not found in "Service 'service1-ODUC4' does not exist in datastore" transportpce_tests/hybrid/test02_B100G_end2end.py:1203: AssertionError =========================== short test summary info ============================ FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_024_get_ODUC4_service1 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_025_check_interface_ODUC4_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_026_check_interface_ODUC4_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_027_check_otn_topo_links FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_028_check_otn_topo_tp FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_030_get_100GE_service_1 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_031_check_interface_100GE_CLIENT_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_032_check_interface_ODU4_CLIENT_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_033_check_interface_ODU4_NETWORK_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_034_check_ODU4_connection_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_035_check_interface_100GE_CLIENT_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_036_check_interface_ODU4_CLIENT_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_037_check_interface_ODU4_NETWORK_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_038_check_ODU4_connection_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_039_check_otn_topo_links FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_040_check_otn_topo_tp FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_042_get_100GE_service_2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_043_check_service_list FAILED 
transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_044_check_otn_topo_links FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_045_check_otn_topo_tp FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_046_delete_100GE_service_2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_047_delete_100GE_service_1 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_048_check_service_list FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_053_check_otn_topo_links FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_054_check_otn_topo_tp FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_056_get_100GE_service_3 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_057_check_service_list FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_058_check_interface_100GE_CLIENT_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_059_check_interface_ODU4_CLIENT_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_060_check_interface_ODU4_NETWORK_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_061_check_ODU4_connection_xpdra2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_062_check_interface_100GE_CLIENT_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_063_check_interface_ODU4_CLIENT_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_064_check_interface_ODU4_NETWORK_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_065_check_ODU4_connection_xpdrc2 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_066_check_otn_topo_links FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_067_check_otn_topo_tp FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_068_delete_100GE_service_3 FAILED transportpce_tests/hybrid/test02_B100G_end2end.py::TransportPCEtesting::test_069_delete_ODUC4_service 39 failed, 70 passed in 348.38s (0:05:48) tests_hybrid: exit 1 (860.66 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid pid=47151 tests_hybrid: FAIL ✖ in 14 minutes 27.45 seconds buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt buildlighty: freeze> python -m pip freeze --all buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.4.0,cryptography==43.0.1,dict2xml==1.7.6,idna==3.10,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.5.0,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.3,requests==2.32.3,setuptools==75.1.0,urllib3==2.2.3,wheel==0.44.0 buildlighty: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED [ERROR] COMPILATION ERROR : [ERROR] 
/w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
  symbol: class YangModuleInfo
  location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
  symbol: class YangModuleInfo
  location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure:
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[17,42] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: package org.opendaylight.yangtools.binding
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[343,30] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[350,23] cannot find symbol
[ERROR] symbol: class YangModuleInfo
[ERROR] location: class io.lighty.controllers.tpce.utils.TPCEUtils
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP.
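Most of the tests_hybrid failures reported above fall into two patterns: RESTCONF requests answered with 409 instead of 200, and KeyError on the 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes' leaf of the XPDR2-NETWORK1 termination points in otn-topology. The following is a minimal, illustrative Python sketch (not part of the suite) of that topology lookup, written over the already-fetched payload and reusing the node and tp identifiers quoted in the traces, so a missing leaf is reported explicitly instead of surfacing as a bare KeyError:

XPDR_ATTR = 'org-openroadm-otn-network-topology:xpdr-tp-port-connection-attributes'

def network1_tp_attributes(response, node_ids=('XPDR-A2-XPDR2', 'XPDR-C2-XPDR2')):
    # Walk the otn-topology payload the way the failing tests do and return
    # {node-id: xpdr-tp-port-connection-attributes} for each XPDR2-NETWORK1 tp.
    found = {}
    for node in response['network'][0]['node']:
        if node['node-id'] not in node_ids:
            continue
        for tp in node.get('ietf-network-topology:termination-point', []):
            if tp['tp-id'] == 'XPDR2-NETWORK1':
                attrs = tp.get(XPDR_ATTR)
                if attrs is None:
                    # The failing runs above hit this case as a bare KeyError.
                    raise AssertionError(XPDR_ATTR + ' missing on ' + node['node-id'])
                found[node['node-id']] = attrs
    return found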
buildlighty: exit 9 (13.69 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh pid=49866 buildlighty: command failed but is marked ignore outcome so handling it as success buildcontroller: OK (113.52=setup[8.57]+cmd[104.94] seconds) testsPCE: OK (324.60=setup[89.19]+cmd[235.41] seconds) sims: OK (11.65=setup[7.58]+cmd[4.07] seconds) build_karaf_tests121: OK (64.80=setup[7.58]+cmd[57.22] seconds) tests121: FAIL code 1 (603.04=setup[7.03]+cmd[596.01] seconds) build_karaf_tests221: OK (66.10=setup[7.51]+cmd[58.59] seconds) tests_tapi: FAIL code 1 (413.56=setup[18.88]+cmd[394.68] seconds) tests221: FAIL code 1 (1718.98=setup[6.98]+cmd[1712.00] seconds) build_karaf_tests71: OK (58.91=setup[12.47]+cmd[46.44] seconds) tests71: OK (425.50=setup[6.36]+cmd[419.14] seconds) build_karaf_tests_hybrid: OK (74.57=setup[20.36]+cmd[54.21] seconds) tests_hybrid: FAIL code 1 (867.45=setup[6.79]+cmd[860.66] seconds) buildlighty: OK (20.49=setup[6.79]+cmd[13.69] seconds) docs: OK (41.99=setup[38.86]+cmd[3.12] seconds) docs-linkcheck: OK (43.18=setup[39.18]+cmd[4.00] seconds) checkbashisms: OK (3.00=setup[1.97]+cmd[0.01,0.06,0.97] seconds) pre-commit: FAIL code 1 (43.10=setup[3.63]+cmd[0.01,0.01,39.46] seconds) pylint: FAIL code 1 (25.64=setup[4.95]+cmd[20.69] seconds) evaluation failed :( (3625.68 seconds) + tox_status=255 + echo '---> Completed tox runs' ---> Completed tox runs + for i in .tox/*/log ++ echo .tox/build_karaf_tests121/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests121 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121 + for i in .tox/*/log ++ echo .tox/build_karaf_tests221/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests221 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests221 + for i in .tox/*/log ++ echo .tox/build_karaf_tests71/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests71 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests71 + for i in .tox/*/log ++ echo .tox/build_karaf_tests_hybrid/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests_hybrid + cp -r .tox/build_karaf_tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests_hybrid + for i in .tox/*/log ++ echo .tox/buildcontroller/log ++ awk -F/ '{print $2}' + tox_env=buildcontroller + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildcontroller + for i in .tox/*/log ++ awk -F/ '{print $2}' ++ echo .tox/buildlighty/log + tox_env=buildlighty + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildlighty + for i in .tox/*/log ++ echo .tox/checkbashisms/log ++ awk -F/ '{print $2}' + tox_env=checkbashisms + cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/checkbashisms + for i in .tox/*/log ++ echo .tox/docs-linkcheck/log ++ awk -F/ '{print $2}' + tox_env=docs-linkcheck + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs-linkcheck + for i in .tox/*/log ++ awk -F/ '{print $2}' ++ echo .tox/docs/log + tox_env=docs + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs + for i in .tox/*/log ++ echo .tox/pre-commit/log ++ awk -F/ '{print $2}' + tox_env=pre-commit + cp -r 
.tox/pre-commit/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pre-commit + for i in .tox/*/log ++ echo .tox/pylint/log ++ awk -F/ '{print $2}' + tox_env=pylint + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pylint + for i in .tox/*/log ++ echo .tox/sims/log ++ awk -F/ '{print $2}' + tox_env=sims + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/sims + for i in .tox/*/log ++ echo .tox/tests121/log ++ awk -F/ '{print $2}' + tox_env=tests121 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests121 + for i in .tox/*/log ++ echo .tox/tests221/log ++ awk -F/ '{print $2}' + tox_env=tests221 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests221 + for i in .tox/*/log ++ echo .tox/tests71/log ++ awk -F/ '{print $2}' + tox_env=tests71 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests71 + for i in .tox/*/log ++ echo .tox/testsPCE/log ++ awk -F/ '{print $2}' + tox_env=testsPCE + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/testsPCE + for i in .tox/*/log ++ echo .tox/tests_hybrid/log ++ awk -F/ '{print $2}' + tox_env=tests_hybrid + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_hybrid + for i in .tox/*/log ++ echo .tox/tests_tapi/log ++ awk -F/ '{print $2}' + tox_env=tests_tapi + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_tapi + DOC_DIR=docs/_build/html + [[ -d docs/_build/html ]] + echo '---> Archiving generated docs' ---> Archiving generated docs + mv docs/_build/html /w/workspace/transportpce-tox-verify-transportpce-master/archives/docs + echo '---> tox-run.sh ends' ---> tox-run.sh ends + test 255 -eq 0 + exit 255 ++ '[' 1 = 1 ']' ++ '[' -x /usr/bin/clear_console ']' ++ /usr/bin/clear_console -q Build step 'Execute shell' marked build as failure $ ssh-agent -k unset SSH_AUTH_SOCK; unset SSH_AGENT_PID; echo Agent pid 12646 killed; [ssh-agent] Stopped. [PostBuildScript] - [INFO] Executing post build scripts. 
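The shell loop above copies each .tox/<env>/log directory into archives/tox/<env>, deriving the environment name with awk -F/ '{print $2}'. A rough Python equivalent of that step, assuming the workspace root as the current directory as in the trace (illustrative only):

import glob
import os
import shutil

ARCHIVE = '/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox'

for log_dir in glob.glob('.tox/*/log'):
    env = log_dir.split(os.sep)[1]  # same as: echo .tox/<env>/log | awk -F/ '{print $2}'
    shutil.copytree(log_dir, os.path.join(ARCHIVE, env), dirs_exist_ok=True)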
[transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins17282203859324293119.sh ---> sysstat.sh [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15507419692865772750.sh ---> package-listing.sh ++ facter osfamily ++ tr '[:upper:]' '[:lower:]' + OS_FAMILY=debian + workspace=/w/workspace/transportpce-tox-verify-transportpce-master + START_PACKAGES=/tmp/packages_start.txt + END_PACKAGES=/tmp/packages_end.txt + DIFF_PACKAGES=/tmp/packages_diff.txt + PACKAGES=/tmp/packages_start.txt + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' + PACKAGES=/tmp/packages_end.txt + case "${OS_FAMILY}" in + dpkg -l + grep '^ii' + '[' -f /tmp/packages_start.txt ']' + '[' -f /tmp/packages_end.txt ']' + diff /tmp/packages_start.txt /tmp/packages_end.txt + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/ + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/ [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins942842969755034506.sh ---> capture-instance-metadata.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-IHAD from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-IHAD/bin to PATH INFO: Running in OpenStack, capturing instance metadata [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins6633225887651896155.sh provisioning config files... Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #2080 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config12952561958121529462tmp Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] Run condition [Regular expression match] enabling perform for step [Provide Configuration files] provisioning config files... copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content SERVER_ID=logs [EnvInject] - Variables injected successfully. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15475788267463767062.sh ---> create-netrc.sh WARN: Log server credential not found. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins6832092850066090121.sh ---> python-tools-install.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-IHAD from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-IHAD/bin to PATH [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins8962715258314031529.sh ---> sudo-logs.sh Archiving 'sudo' log.. 
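package-listing.sh above snapshots the installed dpkg packages ("dpkg -l | grep '^ii'") at the end of the job and diffs them against the snapshot taken at job start before archiving the result. A rough Python sketch of that comparison step, reusing the /tmp file names from the trace (illustrative; the output format differs from diff(1)):

def read_packages(path):
    # Each file holds the output of "dpkg -l | grep '^ii'", one package per line.
    with open(path) as f:
        return {line.rstrip('\n') for line in f}

start = read_packages('/tmp/packages_start.txt')
end = read_packages('/tmp/packages_end.txt')

with open('/tmp/packages_diff.txt', 'w') as out:
    for pkg in sorted(end - start):
        out.write('> ' + pkg + '\n')  # present at job end only
    for pkg in sorted(start - end):
        out.write('< ' + pkg + '\n')  # present at job start only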
[transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins10178861071858253685.sh ---> job-cost.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-IHAD from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 lf-activate-venv(): INFO: Adding /tmp/venv-IHAD/bin to PATH INFO: No Stack... INFO: Retrieving Pricing Info for: v3-standard-4 INFO: Archiving Costs [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins17096451742403316336.sh ---> logs-deploy.sh Setup pyenv: system 3.8.13 3.9.13 3.10.13 * 3.11.7 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-IHAD from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-IHAD/bin to PATH WARNING: Nexus logging server not set INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/2080/ INFO: archiving logs to S3 ---> uname -a: Linux prd-ubuntu2004-docker-4c-16g-43136 5.4.0-190-generic #210-Ubuntu SMP Fri Jul 5 17:03:38 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux ---> lscpu: Architecture: x86_64 CPU op-mode(s): 32-bit, 64-bit Byte Order: Little Endian Address sizes: 40 bits physical, 48 bits virtual CPU(s): 4 On-line CPU(s) list: 0-3 Thread(s) per core: 1 Core(s) per socket: 1 Socket(s): 4 NUMA node(s): 1 Vendor ID: AuthenticAMD CPU family: 23 Model: 49 Model name: AMD EPYC-Rome Processor Stepping: 0 CPU MHz: 2800.000 BogoMIPS: 5600.00 Virtualization: AMD-V Hypervisor vendor: KVM Virtualization type: full L1d cache: 128 KiB L1i cache: 128 KiB L2 cache: 2 MiB L3 cache: 64 MiB NUMA node0 CPU(s): 0-3 Vulnerability Gather data sampling: Not affected Vulnerability Itlb multihit: Not affected Vulnerability L1tf: Not affected Vulnerability Mds: Not affected Vulnerability Meltdown: Not affected Vulnerability Mmio stale data: Not affected Vulnerability Retbleed: Vulnerable Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization Vulnerability Spectre v2: Mitigation; Retpolines; IBPB conditional; IBRS_FW; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected Vulnerability Srbds: Not affected Vulnerability Tsx async abort: Not affected Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities ---> nproc: 4 ---> df -h: Filesystem Size Used Avail Use% Mounted on udev 7.8G 0 7.8G 0% /dev tmpfs 1.6G 1.1M 1.6G 1% /run /dev/vda1 78G 17G 62G 21% / tmpfs 7.9G 0 7.9G 0% /dev/shm tmpfs 5.0M 0 5.0M 0% /run/lock tmpfs 7.9G 0 7.9G 0% /sys/fs/cgroup /dev/loop0 62M 62M 0 100% /snap/core20/1405 /dev/loop2 68M 68M 0 100% /snap/lxd/22753 /dev/vda15 105M 6.1M 99M 6% /boot/efi tmpfs 1.6G 0 
1.6G 0% /run/user/1001 /dev/loop3 39M 39M 0 100% /snap/snapd/21759 /dev/loop4 64M 64M 0 100% /snap/core20/2379 /dev/loop5 92M 92M 0 100% /snap/lxd/29619 ---> free -m: total used free shared buff/cache available Mem: 15997 650 7104 1 8242 15007 Swap: 1023 0 1023 ---> ip addr: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000 link/ether fa:16:3e:52:98:d7 brd ff:ff:ff:ff:ff:ff inet 10.30.171.136/23 brd 10.30.171.255 scope global dynamic ens3 valid_lft 82606sec preferred_lft 82606sec inet6 fe80::f816:3eff:fe52:98d7/64 scope link valid_lft forever preferred_lft forever 3: docker0: mtu 1458 qdisc noqueue state DOWN group default link/ether 02:42:ff:d6:db:f2 brd ff:ff:ff:ff:ff:ff inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0 valid_lft forever preferred_lft forever ---> sar -b -r -n DEV: Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-43136) 10/16/24 _x86_64_ (4 CPU) 14:12:39 LINUX RESTART (4 CPU) 14:13:02 tps rtps wtps dtps bread/s bwrtn/s bdscd/s 14:14:01 358.43 159.02 199.41 0.00 11732.66 47258.97 0.00 14:15:01 149.18 23.05 126.13 0.00 1849.83 20274.37 0.00 14:16:01 167.52 18.85 148.68 0.00 925.18 47236.53 0.00 14:17:01 70.85 1.52 69.34 0.00 65.86 35125.61 0.00 14:18:01 183.08 4.60 178.48 0.00 221.22 152001.60 0.00 14:19:01 216.18 12.05 204.13 0.00 4726.41 48459.26 0.00 14:20:01 75.32 2.13 73.19 0.00 70.64 10280.57 0.00 14:21:01 145.09 1.52 143.58 0.00 172.10 2304.82 0.00 14:22:01 83.10 0.50 82.60 0.00 84.52 1349.38 0.00 14:23:01 60.84 2.55 58.29 0.00 457.39 9106.22 0.00 14:24:01 3.30 0.00 3.30 0.00 0.00 76.39 0.00 14:25:01 50.79 0.20 50.59 0.00 20.66 1771.57 0.00 14:26:01 126.50 0.08 126.41 0.00 2.40 9667.99 0.00 14:27:01 2.43 0.00 2.43 0.00 0.00 49.86 0.00 14:28:01 3.18 0.92 2.27 0.00 22.13 42.53 0.00 14:29:01 86.10 0.00 86.10 0.00 0.00 1452.18 0.00 14:30:01 2.05 0.00 2.05 0.00 0.00 39.06 0.00 14:31:02 71.74 0.00 71.74 0.00 0.00 1038.89 0.00 14:32:01 27.74 0.05 27.69 0.00 1.90 650.06 0.00 14:33:01 68.42 0.02 68.41 0.00 0.13 2975.77 0.00 14:34:01 80.54 0.02 80.52 0.00 0.13 1423.26 0.00 14:35:01 57.92 0.00 57.92 0.00 0.00 934.38 0.00 14:36:01 1.92 0.00 1.92 0.00 0.00 37.06 0.00 14:37:01 2.30 0.00 2.30 0.00 0.00 27.46 0.00 14:38:01 1.45 0.00 1.45 0.00 0.00 18.26 0.00 14:39:01 2.17 0.00 2.17 0.00 0.00 28.00 0.00 14:40:01 53.02 0.00 53.02 0.00 0.00 785.60 0.00 14:41:01 71.35 0.00 71.35 0.00 0.00 1099.42 0.00 14:42:01 1.83 0.00 1.83 0.00 0.00 29.73 0.00 14:43:01 72.80 0.00 72.80 0.00 0.00 1079.29 0.00 14:44:01 43.26 0.00 43.26 0.00 0.00 589.77 0.00 14:45:01 58.81 0.00 58.81 0.00 0.00 839.46 0.00 14:46:01 2.12 0.00 2.12 0.00 0.00 46.92 0.00 14:47:01 66.84 0.00 66.84 0.00 0.00 944.78 0.00 14:48:01 2.77 0.00 2.77 0.00 0.00 59.59 0.00 14:49:01 1.75 0.00 1.75 0.00 0.00 35.46 0.00 14:50:01 1.65 0.00 1.65 0.00 0.00 22.26 0.00 14:51:01 1.47 0.00 1.47 0.00 0.00 18.13 0.00 14:52:01 1.68 0.00 1.68 0.00 0.00 22.53 0.00 14:53:01 58.44 0.00 58.44 0.00 0.00 842.79 0.00 14:54:01 2.30 0.00 2.30 0.00 0.00 54.26 0.00 14:55:01 2.42 0.00 2.42 0.00 0.00 34.13 0.00 14:56:01 1.80 0.00 1.80 0.00 0.00 26.52 0.00 14:57:01 2.38 0.00 2.38 0.00 0.00 42.53 0.00 14:58:01 2.07 0.00 2.07 0.00 0.00 40.39 0.00 14:59:01 2.27 0.00 2.27 0.00 0.00 32.39 0.00 15:00:01 1.85 0.00 1.85 0.00 0.00 33.99 0.00 15:01:01 57.49 0.00 57.49 0.00 0.00 4317.28 0.00 
15:02:01 45.18 0.03 45.14 0.00 0.27 5639.86 0.00 15:03:01 2.18 0.00 2.18 0.00 0.00 46.26 0.00 15:04:01 1.75 0.00 1.75 0.00 0.00 21.86 0.00 15:05:01 2.05 0.00 2.05 0.00 0.00 26.26 0.00 15:06:01 1.43 0.00 1.43 0.00 0.00 19.20 0.00 15:07:01 2.30 0.00 2.30 0.00 0.00 28.80 0.00 15:08:01 1.30 0.00 1.30 0.00 0.00 16.00 0.00 15:09:01 15.43 0.00 15.43 0.00 0.00 248.23 0.00 15:10:01 63.10 0.00 63.10 0.00 0.00 1116.16 0.00 15:11:01 2.30 0.00 2.30 0.00 0.00 40.26 0.00 15:12:01 1.57 0.00 1.57 0.00 0.00 23.06 0.00 15:13:01 3.12 0.00 3.12 0.00 0.00 53.46 0.00 15:14:01 2.07 0.00 2.07 0.00 0.00 42.53 0.00 15:15:01 11.27 3.87 7.40 0.00 78.27 486.80 0.00 Average: 44.50 3.68 40.81 0.00 326.57 6643.69 0.00 14:13:02 kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 14:14:01 13380504 15399576 585592 3.57 61844 2158668 1344176 7.71 851500 1883072 150976 14:15:01 11937824 14973344 986208 6.02 106104 3063552 1939400 11.13 1401340 2682380 678308 14:16:01 10553880 14424100 1534048 9.36 139200 3795704 2282312 13.09 2027932 3368452 139492 14:17:01 7338204 13438152 2518584 15.37 176976 5888344 3490876 20.03 3531932 4974476 1316616 14:18:01 6836632 15007692 941844 5.75 202484 7841168 2019772 11.59 2638324 6271112 63928 14:19:01 4674300 14119532 1824324 11.14 227208 9049948 2632664 15.10 4058048 6962716 258652 14:20:01 184244 9003112 6937892 42.35 222784 8441016 8168568 46.87 9043944 6456996 324 14:21:01 170200 8690160 7250188 44.26 227256 8140624 8574668 49.20 9356092 6161472 136 14:22:01 971504 9485868 6454552 39.40 230680 8129948 7490144 42.97 8569100 6150244 31920 14:23:01 173272 7424696 8515188 51.98 236564 6880200 9725648 55.80 10530564 4996948 1576 14:24:01 174464 7425864 8514012 51.97 236632 6880200 9725648 55.80 10530040 4996676 116 14:25:01 4308796 11799944 4141824 25.28 244128 7105824 5156520 29.58 6234320 5167844 224152 14:26:01 4015428 11510740 4431276 27.05 246588 7107316 5367116 30.79 6555048 5140228 508 14:27:01 3991540 11487024 4454980 27.20 246624 7107456 5383140 30.88 6578124 5140268 348 14:28:01 3973044 11469568 4472568 27.30 246684 7108268 5432740 31.17 6596532 5139060 116 14:29:01 3259388 10758292 5183632 31.64 248916 7108344 6093452 34.96 7320284 5128032 460 14:30:01 3223112 10722308 5219580 31.86 248968 7108580 6125468 35.14 7356180 5128260 196 14:31:02 4054692 11555956 4386196 26.78 250844 7108708 5295628 30.38 6530876 5127356 456 14:32:01 6910700 14479068 1464164 8.94 253292 7169264 2892316 16.59 3625596 5183388 63324 14:33:01 4802288 12371848 3571248 21.80 254288 7169452 4447812 25.52 5728740 5181292 196 14:34:01 7302808 14874032 1070444 6.53 255500 7169812 1894348 10.87 3251756 5168140 516 14:35:01 4401460 11974272 3968616 24.23 256828 7170052 5056684 29.01 6141820 5168372 36 14:36:01 4117436 11690612 4252128 25.96 256840 7170388 5122232 29.39 6424304 5168656 76 14:37:01 4116720 11689908 4252832 25.96 256848 7170392 5122232 29.39 6424568 5168656 56 14:38:01 4115980 11689204 4253536 25.97 256876 7170400 5122232 29.39 6424592 5168664 52 14:39:01 4114484 11687744 4254960 25.97 256908 7170404 5138428 29.48 6427152 5168668 100 14:40:01 6232388 13806964 2136780 13.04 257980 7170632 2914980 16.72 4318152 5167668 92 14:41:01 6218868 13795152 2148712 13.12 259368 7170956 2942680 16.88 4331756 5167828 300 14:42:01 6209836 13786180 2157652 13.17 259400 7170984 2974700 17.07 4340996 5167840 84 14:43:01 6937160 14514704 1429464 8.73 259952 7171616 2774088 15.92 3615704 5167660 844 14:44:01 5996972 13575028 2368736 14.46 260304 7171768 3156620 18.11 4553208 
5167744 124 14:45:01 5539512 13118456 2824984 17.25 260808 7172136 3691116 21.18 5007304 5167984 428 14:46:01 7781032 15360116 584568 3.57 260824 7172256 1730208 9.93 2776188 5168036 280 14:47:01 4119832 11700160 4242352 25.90 261520 7172784 5130220 29.43 6420524 5168544 592 14:48:01 3940928 11521500 4420764 26.99 261528 7173024 5228592 30.00 6599000 5168776 176 14:49:01 3921492 11502360 4439948 27.10 261548 7173296 5244612 30.09 6618156 5169052 208 14:50:01 3920872 11501748 4440520 27.11 261552 7173300 5244612 30.09 6618240 5169056 72 14:51:01 3920880 11501760 4440536 27.11 261556 7173304 5244612 30.09 6618344 5169056 244 14:52:01 3912644 11493584 4448816 27.16 261568 7173352 5244612 30.09 6627368 5169104 164 14:53:01 4103456 11685520 4257004 25.99 262316 7173688 5097004 29.24 6436160 5169376 608 14:54:01 3952724 11535120 4407072 26.90 262336 7174004 5211712 29.90 6587124 5169460 128 14:55:01 3942988 11525552 4416612 26.96 262352 7174148 5227772 29.99 6595760 5169604 92 14:56:01 3940140 11522716 4419516 26.98 262352 7174156 5227772 29.99 6597176 5169604 76 14:57:01 3928652 11511456 4430752 27.05 262376 7174356 5227772 29.99 6610260 5169804 96 14:58:01 3916168 11499284 4442824 27.12 262384 7174680 5227772 29.99 6621964 5170112 300 14:59:01 3913572 11496728 4445380 27.14 262404 7174680 5227772 29.99 6622456 5170128 88 15:00:01 3903552 11486924 4455088 27.20 262432 7174900 5227772 29.99 6634316 5170320 124 15:01:01 4582984 12405492 3537096 21.59 268668 7400060 4969144 28.51 5775988 5345236 146392 15:02:01 3626000 11450816 4491172 27.42 268980 7401904 5334928 30.61 6734956 5339032 156 15:03:01 3580684 11405756 4535988 27.69 268988 7402136 5350924 30.70 6778644 5339184 532 15:04:01 3581096 11406172 4535572 27.69 268988 7402140 5350924 30.70 6778296 5339188 56 15:05:01 3581324 11406404 4535324 27.69 268988 7402152 5350924 30.70 6778556 5339192 296 15:06:01 3580852 11405952 4535764 27.69 268992 7402160 5350924 30.70 6778788 5339208 116 15:07:01 3580852 11405960 4535756 27.69 268996 7402164 5350924 30.70 6778800 5339212 56 15:08:01 3580120 11405232 4536472 27.69 268996 7402164 5350924 30.70 6778976 5339212 264 15:09:01 7140588 14965792 977852 5.97 269028 7402172 1817676 10.43 3246412 5328228 672 15:10:01 3674244 11500368 4441404 27.11 269572 7402496 5217808 29.94 6707688 5317676 208 15:11:01 3657896 11484100 4457644 27.21 269576 7402568 5250392 30.12 6723520 5317688 88 15:12:01 3655684 11481900 4459892 27.23 269584 7402580 5250392 30.12 6724736 5317684 112 15:13:01 3630256 11456716 4485052 27.38 269592 7402808 5250392 30.12 6751968 5317916 264 15:14:01 3614444 11441244 4500508 27.47 269604 7403132 5266416 30.21 6767820 5318104 72 15:15:01 7019536 15047432 896188 5.47 274156 7586452 1713076 9.83 3195248 5485516 187772 Average: 4603921 12068757 3875164 23.66 248023 7084438 4780919 27.43 5945311 5176136 52820 14:13:02 IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 14:14:01 lo 0.68 0.68 0.07 0.07 0.00 0.00 0.00 0.00 14:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:14:01 ens3 444.40 284.54 1661.79 73.51 0.00 0.00 0.00 0.00 14:15:01 lo 3.86 3.86 0.37 0.37 0.00 0.00 0.00 0.00 14:15:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:15:01 ens3 223.85 182.64 3447.74 21.92 0.00 0.00 0.00 0.00 14:16:01 lo 2.80 2.80 0.30 0.30 0.00 0.00 0.00 0.00 14:16:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:16:01 ens3 348.96 297.23 5109.85 31.58 0.00 0.00 0.00 0.00 14:17:01 lo 1.27 1.27 0.12 0.12 0.00 0.00 0.00 0.00 14:17:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 
14:17:01 ens3 313.68 225.73 5136.60 23.62 0.00 0.00 0.00 0.00 14:18:01 lo 0.73 0.73 0.08 0.08 0.00 0.00 0.00 0.00 14:18:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:18:01 ens3 133.68 68.08 2409.15 6.55 0.00 0.00 0.00 0.00 14:19:01 lo 5.15 5.15 1.23 1.23 0.00 0.00 0.00 0.00 14:19:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:19:01 ens3 157.59 84.62 2086.51 5.79 0.00 0.00 0.00 0.00 14:20:01 lo 26.24 26.24 29.05 29.05 0.00 0.00 0.00 0.00 14:20:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:20:01 ens3 1.43 1.28 0.24 0.22 0.00 0.00 0.00 0.00 14:21:01 lo 29.18 29.18 20.64 20.64 0.00 0.00 0.00 0.00 14:21:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:21:01 ens3 1.40 1.22 0.24 0.22 0.00 0.00 0.00 0.00 14:22:01 lo 31.86 31.86 13.31 13.31 0.00 0.00 0.00 0.00 14:22:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:22:01 ens3 2.33 2.13 0.88 0.76 0.00 0.00 0.00 0.00 14:23:01 lo 8.00 8.00 6.44 6.44 0.00 0.00 0.00 0.00 14:23:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:23:01 ens3 0.63 0.48 0.08 0.08 0.00 0.00 0.00 0.00 14:24:01 lo 0.10 0.10 0.01 0.01 0.00 0.00 0.00 0.00 14:24:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:24:01 ens3 0.43 0.12 0.15 0.07 0.00 0.00 0.00 0.00 14:25:01 lo 1.55 1.55 0.13 0.13 0.00 0.00 0.00 0.00 14:25:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:25:01 ens3 3.12 2.47 1.00 1.16 0.00 0.00 0.00 0.00 14:26:01 lo 21.26 21.26 26.85 26.85 0.00 0.00 0.00 0.00 14:26:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:26:01 ens3 1.57 1.35 0.49 0.42 0.00 0.00 0.00 0.00 14:27:01 lo 24.95 24.95 8.28 8.28 0.00 0.00 0.00 0.00 14:27:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:27:01 ens3 1.23 1.52 0.23 0.24 0.00 0.00 0.00 0.00 14:28:01 lo 22.11 22.11 6.78 6.78 0.00 0.00 0.00 0.00 14:28:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:28:01 ens3 2.23 1.43 0.33 0.25 0.00 0.00 0.00 0.00 14:29:01 lo 18.84 18.84 11.24 11.24 0.00 0.00 0.00 0.00 14:29:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:29:01 ens3 2.77 2.30 0.99 0.72 0.00 0.00 0.00 0.00 14:30:01 lo 42.96 42.96 14.08 14.08 0.00 0.00 0.00 0.00 14:30:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:30:01 ens3 1.73 1.57 0.50 0.44 0.00 0.00 0.00 0.00 14:31:02 lo 13.03 13.03 6.62 6.62 0.00 0.00 0.00 0.00 14:31:02 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:31:02 ens3 1.18 1.17 0.18 0.17 0.00 0.00 0.00 0.00 14:32:01 lo 18.40 18.40 5.56 5.56 0.00 0.00 0.00 0.00 14:32:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:32:01 ens3 8.24 8.02 2.41 4.66 0.00 0.00 0.00 0.00 14:33:01 lo 15.21 15.21 10.09 10.09 0.00 0.00 0.00 0.00 14:33:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:33:01 ens3 2.35 1.62 0.63 0.49 0.00 0.00 0.00 0.00 14:34:01 lo 14.05 14.05 4.74 4.74 0.00 0.00 0.00 0.00 14:34:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:34:01 ens3 1.42 1.37 0.56 0.43 0.00 0.00 0.00 0.00 14:35:01 lo 9.58 9.58 4.57 4.57 0.00 0.00 0.00 0.00 14:35:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:35:01 ens3 0.95 0.97 0.14 0.13 0.00 0.00 0.00 0.00 14:36:01 lo 18.90 18.90 8.36 8.36 0.00 0.00 0.00 0.00 14:36:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:36:01 ens3 0.80 0.92 0.15 0.15 0.00 0.00 0.00 0.00 14:37:01 lo 0.58 0.58 0.06 0.06 0.00 0.00 0.00 0.00 14:37:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:37:01 ens3 0.38 0.10 0.02 0.01 0.00 0.00 0.00 0.00 14:38:01 lo 0.25 0.25 0.02 0.02 0.00 0.00 0.00 0.00 14:38:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:38:01 ens3 0.20 0.00 0.01 0.00 0.00 0.00 0.00 0.00 14:39:01 lo 2.45 
2.45 0.87 0.87 0.00 0.00 0.00 0.00 14:39:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:39:01 ens3 1.15 0.57 0.26 0.14 0.00 0.00 0.00 0.00 14:40:01 lo 5.73 5.73 6.67 6.67 0.00 0.00 0.00 0.00 14:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:40:01 ens3 1.17 1.18 0.40 0.35 0.00 0.00 0.00 0.00 14:41:01 lo 7.85 7.85 2.73 2.73 0.00 0.00 0.00 0.00 14:41:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:41:01 ens3 0.73 0.73 0.11 0.11 0.00 0.00 0.00 0.00 14:42:01 lo 3.77 3.77 0.94 0.94 0.00 0.00 0.00 0.00 14:42:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:42:01 ens3 0.47 0.58 0.09 0.10 0.00 0.00 0.00 0.00 14:43:01 lo 29.90 29.90 9.70 9.70 0.00 0.00 0.00 0.00 14:43:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:43:01 ens3 1.00 1.03 0.14 0.15 0.00 0.00 0.00 0.00 14:44:01 lo 16.20 16.20 11.89 11.89 0.00 0.00 0.00 0.00 14:44:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:44:01 ens3 1.02 1.18 0.29 0.23 0.00 0.00 0.00 0.00 14:45:01 lo 11.80 11.80 12.89 12.89 0.00 0.00 0.00 0.00 14:45:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:45:01 ens3 0.87 0.83 0.12 0.12 0.00 0.00 0.00 0.00 14:46:01 lo 18.23 18.23 10.02 10.02 0.00 0.00 0.00 0.00 14:46:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:46:01 ens3 1.07 1.33 0.21 0.21 0.00 0.00 0.00 0.00 14:47:01 lo 21.96 21.96 9.27 9.27 0.00 0.00 0.00 0.00 14:47:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:47:01 ens3 0.98 0.97 0.14 0.13 0.00 0.00 0.00 0.00 14:48:01 lo 31.63 31.63 10.70 10.70 0.00 0.00 0.00 0.00 14:48:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:48:01 ens3 0.87 1.00 0.17 0.17 0.00 0.00 0.00 0.00 14:49:01 lo 36.29 36.29 10.63 10.63 0.00 0.00 0.00 0.00 14:49:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:49:01 ens3 0.75 0.68 0.21 0.14 0.00 0.00 0.00 0.00 14:50:01 lo 0.27 0.27 0.02 0.02 0.00 0.00 0.00 0.00 14:50:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:50:01 ens3 1.02 0.40 0.33 0.22 0.00 0.00 0.00 0.00 14:51:01 lo 0.58 0.58 0.06 0.06 0.00 0.00 0.00 0.00 14:51:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:51:01 ens3 0.80 0.12 0.07 0.01 0.00 0.00 0.00 0.00 14:52:01 lo 3.68 3.68 1.38 1.38 0.00 0.00 0.00 0.00 14:52:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:52:01 ens3 0.63 0.42 0.33 0.26 0.00 0.00 0.00 0.00 14:53:01 lo 16.56 16.56 15.27 15.27 0.00 0.00 0.00 0.00 14:53:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:53:01 ens3 0.88 0.73 0.11 0.10 0.00 0.00 0.00 0.00 14:54:01 lo 17.93 17.93 8.10 8.10 0.00 0.00 0.00 0.00 14:54:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:54:01 ens3 1.45 1.07 0.33 0.21 0.00 0.00 0.00 0.00 14:55:01 lo 3.25 3.25 1.40 1.40 0.00 0.00 0.00 0.00 14:55:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:55:01 ens3 0.90 1.00 0.36 0.32 0.00 0.00 0.00 0.00 14:56:01 lo 3.73 3.73 3.06 3.06 0.00 0.00 0.00 0.00 14:56:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:56:01 ens3 1.40 1.08 0.44 0.34 0.00 0.00 0.00 0.00 14:57:01 lo 19.91 19.91 7.48 7.48 0.00 0.00 0.00 0.00 14:57:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:57:01 ens3 0.72 0.75 0.12 0.11 0.00 0.00 0.00 0.00 14:58:01 lo 15.11 15.11 6.02 6.02 0.00 0.00 0.00 0.00 14:58:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:58:01 ens3 0.50 0.45 0.07 0.07 0.00 0.00 0.00 0.00 14:59:01 lo 3.35 3.35 1.40 1.40 0.00 0.00 0.00 0.00 14:59:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 14:59:01 ens3 0.93 1.00 0.26 0.20 0.00 0.00 0.00 0.00 15:00:01 lo 19.86 19.86 7.62 7.62 0.00 0.00 0.00 0.00 15:00:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 
15:00:01 ens3 0.62 0.62 0.10 0.10 0.00 0.00 0.00 0.00 15:01:01 lo 4.53 4.53 1.44 1.44 0.00 0.00 0.00 0.00 15:01:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:01:01 ens3 2.98 3.55 1.02 1.33 0.00 0.00 0.00 0.00 15:02:01 lo 44.38 44.38 37.36 37.36 0.00 0.00 0.00 0.00 15:02:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:02:01 ens3 0.90 1.17 0.18 0.19 0.00 0.00 0.00 0.00 15:03:01 lo 27.41 27.41 10.26 10.26 0.00 0.00 0.00 0.00 15:03:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:03:01 ens3 1.08 1.27 0.19 0.21 0.00 0.00 0.00 0.00 15:04:01 lo 0.27 0.27 0.02 0.02 0.00 0.00 0.00 0.00 15:04:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:04:01 ens3 0.37 0.22 0.14 0.07 0.00 0.00 0.00 0.00 15:05:01 lo 0.60 0.60 0.06 0.06 0.00 0.00 0.00 0.00 15:05:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:05:01 ens3 0.25 0.10 0.01 0.01 0.00 0.00 0.00 0.00 15:06:01 lo 0.45 0.45 0.04 0.04 0.00 0.00 0.00 0.00 15:06:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:06:01 ens3 0.20 0.10 0.02 0.02 0.00 0.00 0.00 0.00 15:07:01 lo 0.40 0.40 0.04 0.04 0.00 0.00 0.00 0.00 15:07:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:07:01 ens3 0.30 0.10 0.02 0.01 0.00 0.00 0.00 0.00 15:08:01 lo 0.12 0.12 0.01 0.01 0.00 0.00 0.00 0.00 15:08:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:08:01 ens3 0.68 0.00 0.07 0.00 0.00 0.00 0.00 0.00 15:09:01 lo 1.50 1.50 0.13 0.13 0.00 0.00 0.00 0.00 15:09:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:09:01 ens3 0.97 0.87 0.48 0.36 0.00 0.00 0.00 0.00 15:10:01 lo 33.21 33.21 17.31 17.31 0.00 0.00 0.00 0.00 15:10:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:10:01 ens3 1.15 0.67 0.16 0.11 0.00 0.00 0.00 0.00 15:11:01 lo 5.48 5.48 1.89 1.89 0.00 0.00 0.00 0.00 15:11:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:11:01 ens3 1.08 0.92 0.39 0.33 0.00 0.00 0.00 0.00 15:12:01 lo 6.00 6.00 2.83 2.83 0.00 0.00 0.00 0.00 15:12:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:12:01 ens3 1.07 0.97 0.21 0.21 0.00 0.00 0.00 0.00 15:13:01 lo 27.40 27.40 8.62 8.62 0.00 0.00 0.00 0.00 15:13:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:13:01 ens3 1.00 0.82 0.18 0.16 0.00 0.00 0.00 0.00 15:14:01 lo 20.73 20.73 8.03 8.03 0.00 0.00 0.00 0.00 15:14:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:14:01 ens3 0.90 0.65 0.26 0.17 0.00 0.00 0.00 0.00 15:15:01 lo 35.85 35.85 11.87 11.87 0.00 0.00 0.00 0.00 15:15:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 15:15:01 ens3 147.00 116.17 1947.83 12.94 0.00 0.00 0.00 0.00 Average: lo 13.39 13.39 6.73 6.73 0.00 0.00 0.00 0.00 Average: docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 Average: ens3 29.51 21.19 351.65 3.10 0.00 0.00 0.00 0.00 ---> sar -P ALL: Linux 5.4.0-190-generic (prd-ubuntu2004-docker-4c-16g-43136) 10/16/24 _x86_64_ (4 CPU) 14:12:39 LINUX RESTART (4 CPU) 14:13:02 CPU %user %nice %system %iowait %steal %idle 14:14:01 all 18.79 16.63 13.52 5.33 0.12 45.61 14:14:01 0 21.04 15.38 12.67 4.08 0.12 46.71 14:14:01 1 20.76 15.84 12.91 5.58 0.12 44.79 14:14:01 2 16.90 16.68 14.00 6.32 0.12 45.99 14:14:01 3 16.47 18.64 14.52 5.34 0.10 44.93 14:15:01 all 33.88 0.00 3.33 8.26 0.10 54.43 14:15:01 0 23.76 0.00 2.64 5.31 0.13 68.15 14:15:01 1 34.02 0.00 3.26 12.04 0.07 50.61 14:15:01 2 21.33 0.00 3.02 9.12 0.10 66.43 14:15:01 3 56.53 0.00 4.41 6.55 0.10 32.40 14:16:01 all 80.03 0.00 2.89 4.14 0.13 12.81 14:16:01 0 82.45 0.00 3.23 7.57 0.14 6.62 14:16:01 1 78.93 0.00 2.34 3.69 0.10 14.95 14:16:01 2 74.01 0.00 3.19 3.59 0.12 19.10 14:16:01 3 84.80 0.00 2.81 1.71 0.17 10.50 14:17:01 
all 59.58 0.00 3.43 1.27 0.09 35.64 14:17:01 0 54.11 0.00 3.00 0.79 0.08 42.03 14:17:01 1 78.00 0.00 4.47 1.81 0.10 15.62 14:17:01 2 52.60 0.00 2.86 2.05 0.10 42.39 14:17:01 3 53.59 0.00 3.38 0.44 0.08 42.51 14:18:01 all 49.94 0.00 3.39 29.47 0.12 17.08 14:18:01 0 52.91 0.00 2.90 11.62 0.15 32.42 14:18:01 1 53.43 0.00 3.51 29.32 0.14 13.60 14:18:01 2 52.99 0.00 3.20 37.86 0.10 5.85 14:18:01 3 40.57 0.00 3.93 38.76 0.08 16.65 14:19:01 all 86.45 0.00 3.99 4.38 0.17 5.00 14:19:01 0 86.66 0.00 4.35 5.04 0.16 3.80 14:19:01 1 87.09 0.00 4.01 2.46 0.17 6.28 14:19:01 2 85.45 0.00 3.89 6.81 0.15 3.70 14:19:01 3 86.62 0.00 3.73 3.24 0.19 6.21 14:20:01 all 58.58 0.00 2.23 0.55 0.11 38.52 14:20:01 0 58.32 0.00 2.54 0.47 0.12 38.55 14:20:01 1 61.85 0.00 2.08 0.77 0.10 35.19 14:20:01 2 54.08 0.00 1.68 0.05 0.12 44.07 14:20:01 3 60.10 0.00 2.61 0.92 0.12 36.24 14:21:01 all 48.23 0.00 1.90 0.70 0.10 49.06 14:21:01 0 49.97 0.00 1.84 0.50 0.12 47.57 14:21:01 1 48.09 0.00 2.12 1.96 0.12 47.71 14:21:01 2 47.53 0.00 1.53 0.22 0.08 50.63 14:21:01 3 47.34 0.00 2.11 0.13 0.10 50.32 14:22:01 all 25.73 0.00 1.09 0.43 0.09 72.67 14:22:01 0 26.52 0.00 0.99 0.07 0.08 72.34 14:22:01 1 26.38 0.00 1.36 1.58 0.08 70.60 14:22:01 2 23.92 0.00 0.92 0.03 0.10 75.03 14:22:01 3 26.09 0.00 1.09 0.03 0.10 72.68 14:23:01 all 34.34 0.00 1.51 1.29 0.11 62.75 14:23:01 0 31.32 0.00 1.13 0.17 0.15 67.23 14:23:01 1 31.70 0.00 1.82 3.66 0.10 62.71 14:23:01 2 38.12 0.00 1.29 1.04 0.08 59.47 14:23:01 3 36.21 0.00 1.81 0.27 0.10 61.62 14:24:01 all 1.29 0.00 0.36 0.01 0.07 98.27 14:24:01 0 1.19 0.00 0.37 0.00 0.07 98.38 14:24:01 1 1.14 0.00 0.40 0.00 0.08 98.38 14:24:01 2 1.00 0.00 0.35 0.02 0.07 98.56 14:24:01 3 1.82 0.00 0.31 0.02 0.07 97.78 14:24:01 CPU %user %nice %system %iowait %steal %idle 14:25:01 all 31.02 0.00 1.48 0.24 0.10 67.16 14:25:01 0 26.19 0.00 1.50 0.10 0.08 72.13 14:25:01 1 38.84 0.00 1.52 0.37 0.10 59.17 14:25:01 2 28.39 0.00 1.78 0.49 0.12 69.22 14:25:01 3 30.65 0.00 1.13 0.00 0.10 68.12 14:26:01 all 38.72 0.00 1.48 0.83 0.12 58.86 14:26:01 0 37.62 0.00 1.56 0.54 0.12 60.16 14:26:01 1 40.40 0.00 1.49 2.43 0.12 55.55 14:26:01 2 40.28 0.00 1.47 0.02 0.10 58.13 14:26:01 3 36.59 0.00 1.37 0.32 0.13 61.59 14:27:01 all 4.10 0.00 0.45 0.02 0.08 95.35 14:27:01 0 4.63 0.00 0.52 0.02 0.07 94.77 14:27:01 1 4.03 0.00 0.48 0.05 0.08 95.36 14:27:01 2 3.89 0.00 0.34 0.00 0.07 95.71 14:27:01 3 3.88 0.00 0.47 0.00 0.08 95.57 14:28:01 all 3.37 0.00 0.45 21.87 0.07 74.24 14:28:01 0 2.82 0.00 0.37 30.86 0.07 65.89 14:28:01 1 3.16 0.00 0.45 30.67 0.05 65.67 14:28:01 2 4.08 0.00 0.53 12.96 0.08 82.34 14:28:01 3 3.40 0.00 0.47 13.05 0.07 83.01 14:29:01 all 38.72 0.00 1.40 7.02 0.12 52.74 14:29:01 0 38.92 0.00 1.31 6.71 0.12 52.95 14:29:01 1 38.48 0.00 1.49 7.25 0.12 52.66 14:29:01 2 37.91 0.00 1.60 6.84 0.13 53.51 14:29:01 3 39.58 0.00 1.19 7.28 0.12 51.84 14:30:01 all 5.67 0.00 0.57 0.01 0.10 93.66 14:30:01 0 5.47 0.00 0.60 0.00 0.13 93.80 14:30:01 1 4.24 0.00 0.55 0.00 0.07 95.15 14:30:01 2 6.95 0.00 0.56 0.00 0.10 92.39 14:30:01 3 6.02 0.00 0.55 0.03 0.08 93.31 14:31:02 all 31.76 0.00 1.29 0.95 0.12 65.88 14:31:02 0 30.81 0.00 1.21 0.13 0.13 67.72 14:31:02 1 32.27 0.00 1.35 0.49 0.10 65.78 14:31:02 2 31.60 0.00 1.13 2.53 0.08 64.66 14:31:02 3 32.36 0.00 1.48 0.65 0.17 65.34 14:32:01 all 26.46 0.00 1.31 0.42 0.09 71.72 14:32:01 0 24.43 0.00 1.19 0.02 0.09 74.28 14:32:01 1 30.41 0.00 1.21 0.26 0.09 68.04 14:32:01 2 23.82 0.00 1.26 0.02 0.10 74.80 14:32:01 3 27.20 0.00 1.57 1.38 0.10 69.74 14:33:01 all 26.97 0.00 0.90 0.49 0.08 71.55 14:33:01 
0 29.04 0.00 1.26 0.00 0.08 69.62 14:33:01 1 25.91 0.00 0.81 0.02 0.07 73.20 14:33:01 2 25.52 0.00 0.67 0.00 0.10 73.70 14:33:01 3 27.42 0.00 0.88 1.93 0.07 69.70 14:34:01 all 49.35 0.00 1.86 0.31 0.10 48.38 14:34:01 0 48.40 0.00 2.25 0.40 0.12 48.83 14:34:01 1 51.59 0.00 1.54 0.60 0.10 46.16 14:34:01 2 49.89 0.00 1.82 0.07 0.08 48.13 14:34:01 3 47.52 0.00 1.81 0.15 0.10 50.41 14:35:01 all 37.59 0.00 1.09 0.33 0.09 60.90 14:35:01 0 38.73 0.00 0.84 0.30 0.08 60.05 14:35:01 1 39.32 0.00 1.40 0.55 0.10 58.63 14:35:01 2 35.50 0.00 1.24 0.00 0.10 63.16 14:35:01 3 36.79 0.00 0.89 0.45 0.08 61.79 14:35:01 CPU %user %nice %system %iowait %steal %idle 14:36:01 all 6.43 0.00 0.26 0.01 0.07 93.24 14:36:01 0 6.57 0.00 0.35 0.02 0.07 93.00 14:36:01 1 6.76 0.00 0.27 0.02 0.07 92.89 14:36:01 2 6.49 0.00 0.25 0.00 0.08 93.17 14:36:01 3 5.88 0.00 0.15 0.00 0.07 93.90 14:37:01 all 0.34 0.00 0.10 0.01 0.06 99.48 14:37:01 0 0.33 0.00 0.03 0.02 0.05 99.56 14:37:01 1 0.20 0.00 0.07 0.03 0.07 99.63 14:37:01 2 0.47 0.00 0.13 0.00 0.05 99.35 14:37:01 3 0.37 0.00 0.17 0.00 0.07 99.40 14:38:01 all 0.44 0.00 0.12 0.00 0.07 99.38 14:38:01 0 0.88 0.00 0.15 0.02 0.08 98.87 14:38:01 1 0.32 0.00 0.08 0.00 0.05 99.55 14:38:01 2 0.33 0.00 0.12 0.00 0.07 99.48 14:38:01 3 0.20 0.00 0.12 0.00 0.07 99.61 14:39:01 all 0.98 0.00 0.15 0.01 0.07 98.79 14:39:01 0 2.27 0.00 0.12 0.02 0.07 97.54 14:39:01 1 0.42 0.00 0.08 0.03 0.05 99.42 14:39:01 2 0.65 0.00 0.22 0.00 0.08 99.05 14:39:01 3 0.55 0.00 0.18 0.00 0.07 99.20 14:40:01 all 31.16 0.00 1.10 0.22 0.08 67.44 14:40:01 0 30.89 0.00 1.40 0.42 0.08 67.20 14:40:01 1 32.17 0.00 1.26 0.03 0.08 66.45 14:40:01 2 32.15 0.00 0.80 0.35 0.07 66.63 14:40:01 3 29.43 0.00 0.92 0.08 0.08 69.48 14:41:01 all 31.09 0.00 1.10 0.29 0.09 67.43 14:41:01 0 29.89 0.00 1.09 0.08 0.08 68.85 14:41:01 1 32.38 0.00 1.26 0.00 0.10 66.26 14:41:01 2 30.56 0.00 1.21 1.07 0.08 67.08 14:41:01 3 31.54 0.00 0.84 0.02 0.08 67.52 14:42:01 all 1.09 0.00 0.13 0.02 0.05 98.71 14:42:01 0 1.27 0.00 0.13 0.00 0.03 98.56 14:42:01 1 1.10 0.00 0.17 0.00 0.08 98.65 14:42:01 2 1.02 0.00 0.15 0.05 0.05 98.73 14:42:01 3 0.97 0.00 0.07 0.02 0.05 98.89 14:43:01 all 59.94 0.00 1.93 0.78 0.18 37.17 14:43:01 0 58.08 0.00 2.22 0.17 0.15 39.38 14:43:01 1 59.97 0.00 2.00 1.23 0.15 36.65 14:43:01 2 59.43 0.00 1.91 1.49 0.28 36.88 14:43:01 3 62.30 0.00 1.59 0.22 0.12 35.77 14:44:01 all 15.28 0.00 0.58 0.42 0.07 83.65 14:44:01 0 14.06 0.00 0.52 0.10 0.07 85.26 14:44:01 1 16.97 0.00 0.47 0.20 0.07 82.30 14:44:01 2 14.45 0.00 0.54 1.34 0.08 83.60 14:44:01 3 15.63 0.00 0.78 0.05 0.07 83.46 14:45:01 all 37.12 0.00 1.29 0.41 0.10 61.09 14:45:01 0 38.50 0.00 1.77 0.42 0.10 59.22 14:45:01 1 39.66 0.00 1.16 0.02 0.10 59.06 14:45:01 2 37.75 0.00 1.08 1.18 0.10 59.90 14:45:01 3 32.60 0.00 1.15 0.02 0.10 66.13 14:46:01 all 7.55 0.00 0.38 0.05 0.06 91.96 14:46:01 0 7.70 0.00 0.30 0.00 0.05 91.95 14:46:01 1 7.59 0.00 0.35 0.00 0.07 91.99 14:46:01 2 7.38 0.00 0.42 0.22 0.07 91.92 14:46:01 3 7.53 0.00 0.45 0.00 0.05 91.97 14:46:01 CPU %user %nice %system %iowait %steal %idle 14:47:01 all 50.82 0.00 1.45 0.32 0.12 47.29 14:47:01 0 48.08 0.00 1.44 0.27 0.12 50.10 14:47:01 1 49.04 0.00 1.60 0.20 0.13 49.02 14:47:01 2 53.50 0.00 1.39 0.80 0.12 44.19 14:47:01 3 52.65 0.00 1.39 0.02 0.12 45.83 14:48:01 all 6.50 0.00 0.45 0.03 0.08 92.95 14:48:01 0 6.19 0.00 0.50 0.00 0.07 93.25 14:48:01 1 6.74 0.00 0.37 0.03 0.07 92.80 14:48:01 2 6.23 0.00 0.57 0.07 0.08 93.06 14:48:01 3 6.84 0.00 0.38 0.00 0.08 92.70 14:49:01 all 4.47 0.00 0.33 0.01 0.06 95.14 14:49:01 0 3.96 
0.00 0.33 0.00 0.05 95.65 14:49:01 1 4.22 0.00 0.33 0.02 0.07 95.36 14:49:01 2 4.09 0.00 0.28 0.02 0.05 95.56 14:49:01 3 5.57 0.00 0.38 0.00 0.07 93.98 14:50:01 all 1.11 0.00 0.22 0.01 0.06 98.61 14:50:01 0 0.52 0.00 0.22 0.00 0.07 99.20 14:50:01 1 0.58 0.00 0.20 0.02 0.07 99.13 14:50:01 2 0.55 0.00 0.15 0.02 0.05 99.23 14:50:01 3 2.75 0.00 0.29 0.00 0.07 96.89 14:51:01 all 0.80 0.00 0.25 0.04 0.07 98.83 14:51:01 0 0.59 0.00 0.18 0.00 0.05 99.18 14:51:01 1 0.67 0.00 0.23 0.00 0.08 99.02 14:51:01 2 0.52 0.00 0.32 0.15 0.08 98.93 14:51:01 3 1.44 0.00 0.28 0.00 0.08 98.20 14:52:01 all 1.05 0.00 0.24 0.02 0.05 98.63 14:52:01 0 0.57 0.00 0.12 0.00 0.03 99.28 14:52:01 1 0.84 0.00 0.18 0.02 0.05 98.91 14:52:01 2 1.17 0.00 0.35 0.07 0.07 98.35 14:52:01 3 1.63 0.00 0.32 0.00 0.07 97.99 14:53:01 all 51.50 0.00 1.56 0.38 0.09 46.47 14:53:01 0 50.80 0.00 1.67 0.22 0.10 47.22 14:53:01 1 49.06 0.00 1.42 0.00 0.08 49.43 14:53:01 2 53.85 0.00 1.81 1.26 0.10 42.97 14:53:01 3 52.31 0.00 1.34 0.03 0.08 46.24 14:54:01 all 6.83 0.00 0.32 0.01 0.07 92.76 14:54:01 0 7.17 0.00 0.35 0.00 0.08 92.40 14:54:01 1 7.25 0.00 0.30 0.02 0.07 92.37 14:54:01 2 6.67 0.00 0.32 0.03 0.07 92.91 14:54:01 3 6.24 0.00 0.33 0.00 0.07 93.36 14:55:01 all 1.12 0.00 0.20 0.01 0.05 98.61 14:55:01 0 1.42 0.00 0.27 0.00 0.05 98.26 14:55:01 1 0.87 0.00 0.18 0.03 0.07 98.85 14:55:01 2 1.57 0.00 0.32 0.02 0.07 98.03 14:55:01 3 0.60 0.00 0.05 0.00 0.03 99.32 14:56:01 all 0.92 0.00 0.23 0.00 0.06 98.79 14:56:01 0 1.15 0.00 0.38 0.00 0.08 98.38 14:56:01 1 0.89 0.00 0.10 0.00 0.03 98.98 14:56:01 2 0.85 0.00 0.22 0.02 0.07 98.85 14:56:01 3 0.80 0.00 0.20 0.00 0.05 98.95 14:57:01 all 3.25 0.00 0.34 6.94 0.08 89.39 14:57:01 0 3.12 0.00 0.37 0.22 0.07 96.23 14:57:01 1 3.12 0.00 0.30 1.02 0.12 95.44 14:57:01 2 3.45 0.00 0.38 26.24 0.07 69.87 14:57:01 3 3.31 0.00 0.32 0.20 0.07 96.10 14:57:01 CPU %user %nice %system %iowait %steal %idle 14:58:01 all 1.84 0.00 0.26 0.03 0.07 97.81 14:58:01 0 1.69 0.00 0.23 0.00 0.07 98.01 14:58:01 1 1.94 0.00 0.27 0.08 0.05 97.66 14:58:01 2 2.01 0.00 0.30 0.03 0.08 97.57 14:58:01 3 1.73 0.00 0.22 0.00 0.07 97.99 14:59:01 all 0.89 0.00 0.20 0.01 0.05 98.85 14:59:01 0 0.74 0.00 0.15 0.00 0.03 99.08 14:59:01 1 0.64 0.00 0.12 0.05 0.03 99.16 14:59:01 2 0.90 0.00 0.22 0.00 0.07 98.81 14:59:01 3 1.28 0.00 0.32 0.00 0.05 98.35 15:00:01 all 2.51 0.00 0.26 0.03 0.07 97.13 15:00:01 0 2.19 0.00 0.23 0.00 0.07 97.51 15:00:01 1 1.97 0.00 0.32 0.13 0.08 97.49 15:00:01 2 3.79 0.00 0.25 0.00 0.05 95.91 15:00:01 3 2.09 0.00 0.23 0.00 0.07 97.61 15:01:01 all 44.64 0.00 1.83 0.67 0.10 52.75 15:01:01 0 52.96 0.00 2.01 0.47 0.12 44.45 15:01:01 1 44.41 0.00 1.93 0.77 0.10 52.80 15:01:01 2 43.53 0.00 2.04 1.37 0.12 52.94 15:01:01 3 37.72 0.00 1.36 0.07 0.08 60.77 15:02:01 all 21.62 0.00 0.68 0.84 0.09 76.78 15:02:01 0 22.00 0.00 0.59 0.57 0.10 76.75 15:02:01 1 22.68 0.00 0.77 0.57 0.10 75.88 15:02:01 2 21.01 0.00 0.57 1.74 0.08 76.59 15:02:01 3 20.78 0.00 0.79 0.49 0.07 77.89 15:03:01 all 3.92 0.00 0.35 0.01 0.06 95.66 15:03:01 0 3.72 0.00 0.38 0.00 0.07 95.83 15:03:01 1 3.89 0.00 0.28 0.05 0.05 95.73 15:03:01 2 3.83 0.00 0.28 0.00 0.07 95.82 15:03:01 3 4.23 0.00 0.45 0.00 0.07 95.25 15:04:01 all 0.77 0.00 0.25 0.00 0.07 98.91 15:04:01 0 0.83 0.00 0.32 0.00 0.08 98.77 15:04:01 1 0.50 0.00 0.18 0.02 0.07 99.23 15:04:01 2 0.35 0.00 0.20 0.00 0.05 99.40 15:04:01 3 1.40 0.00 0.28 0.00 0.08 98.24 15:05:01 all 0.78 0.00 0.20 0.01 0.06 98.95 15:05:01 0 0.53 0.00 0.20 0.00 0.05 99.22 15:05:01 1 0.25 0.00 0.10 0.05 0.07 99.53 15:05:01 2 0.40 0.00 
0.07 0.00 0.05 99.48 15:05:01 3 1.93 0.00 0.43 0.00 0.08 97.56 15:06:01 all 0.43 0.00 0.19 0.00 0.05 99.33 15:06:01 0 0.38 0.00 0.20 0.00 0.05 99.37 15:06:01 1 0.35 0.00 0.13 0.02 0.03 99.47 15:06:01 2 0.17 0.00 0.10 0.00 0.03 99.70 15:06:01 3 0.80 0.00 0.34 0.00 0.07 98.79 15:07:01 all 0.41 0.00 0.24 0.01 0.06 99.28 15:07:01 0 0.45 0.00 0.30 0.00 0.07 99.18 15:07:01 1 0.42 0.00 0.22 0.03 0.05 99.28 15:07:01 2 0.33 0.00 0.23 0.00 0.05 99.38 15:07:01 3 0.45 0.00 0.20 0.00 0.07 99.28 15:08:01 all 0.38 0.00 0.23 0.01 0.06 99.32 15:08:01 0 0.43 0.00 0.22 0.00 0.08 99.26 15:08:01 1 0.44 0.00 0.32 0.03 0.07 99.15 15:08:01 2 0.27 0.00 0.15 0.00 0.05 99.53 15:08:01 3 0.40 0.00 0.22 0.00 0.05 99.33 15:08:01 CPU %user %nice %system %iowait %steal %idle 15:09:01 all 7.96 0.00 0.65 0.05 0.10 91.24 15:09:01 0 8.31 0.00 0.72 0.00 0.10 90.87 15:09:01 1 8.00 0.00 0.43 0.05 0.10 91.41 15:09:01 2 7.64 0.00 0.60 0.05 0.12 91.59 15:09:01 3 7.88 0.00 0.83 0.10 0.10 91.09 15:10:01 all 50.16 0.00 1.25 0.27 0.12 48.20 15:10:01 0 52.20 0.00 1.31 0.02 0.13 46.34 15:10:01 1 48.11 0.00 1.32 0.43 0.12 50.03 15:10:01 2 53.19 0.00 1.07 0.07 0.10 45.57 15:10:01 3 47.16 0.00 1.30 0.55 0.12 50.87 15:11:01 all 1.72 0.00 0.14 0.07 0.07 98.00 15:11:01 0 1.98 0.00 0.22 0.00 0.10 97.70 15:11:01 1 2.03 0.00 0.08 0.25 0.05 97.59 15:11:01 2 1.46 0.00 0.07 0.00 0.05 98.43 15:11:01 3 1.43 0.00 0.20 0.02 0.07 98.29 15:12:01 all 1.68 0.00 0.18 0.16 0.05 97.92 15:12:01 0 1.17 0.00 0.17 0.00 0.05 98.61 15:12:01 1 1.51 0.00 0.25 0.15 0.07 98.02 15:12:01 2 0.79 0.00 0.08 0.50 0.03 98.59 15:12:01 3 3.24 0.00 0.23 0.00 0.05 96.48 15:13:01 all 4.28 0.00 0.23 0.02 0.07 95.41 15:13:01 0 3.90 0.00 0.22 0.00 0.07 95.82 15:13:01 1 3.76 0.00 0.27 0.05 0.07 95.86 15:13:01 2 4.74 0.00 0.23 0.02 0.07 94.94 15:13:01 3 4.73 0.00 0.18 0.00 0.07 95.02 15:14:01 all 3.23 0.00 0.15 0.01 0.07 96.54 15:14:01 0 2.85 0.00 0.18 0.00 0.07 96.90 15:14:01 1 2.73 0.00 0.22 0.02 0.08 96.95 15:14:01 2 4.18 0.00 0.08 0.02 0.05 95.67 15:14:01 3 3.13 0.00 0.13 0.00 0.07 96.67 15:15:01 all 17.15 0.00 1.04 0.34 0.08 81.39 15:15:01 0 12.84 0.00 1.19 0.05 0.08 85.83 15:15:01 1 24.97 0.00 1.06 0.39 0.08 73.50 15:15:01 2 17.58 0.00 0.99 0.80 0.08 80.55 15:15:01 3 13.20 0.00 0.94 0.12 0.07 85.68 Average: all 20.50 0.26 1.14 1.62 0.08 76.40 Average: 0 20.22 0.24 1.14 1.25 0.09 77.07 Average: 1 21.23 0.25 1.14 1.79 0.08 75.50 Average: 2 20.03 0.26 1.10 2.07 0.08 76.45 Average: 3 20.51 0.29 1.17 1.36 0.08 76.58
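The sar -P ALL dump above closes with per-CPU averages (roughly 76% idle overall for this run). A small sketch of how those average %idle figures could be recomputed from the per-interval samples, assuming the canonical one-sample-per-line sar layout:

from collections import defaultdict

def average_idle(sar_lines):
    # Accumulate %idle (last column) per CPU label from rows shaped like
    # "15:15:01 all 17.15 0.00 1.04 0.34 0.08 81.39"; header and Average rows are skipped.
    totals, counts = defaultdict(float), defaultdict(int)
    for line in sar_lines:
        fields = line.split()
        if len(fields) != 8 or line.startswith('Average'):
            continue
        try:
            idle = float(fields[-1])
        except ValueError:
            continue  # e.g. the repeated "CPU %user ... %idle" header rows
        totals[fields[1]] += idle
        counts[fields[1]] += 1
    return {cpu: totals[cpu] / counts[cpu] for cpu in totals}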