17:42:01 Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/120111
17:42:01 Running as SYSTEM
17:42:01 [EnvInject] - Loading node environment variables.
17:42:01 Building remotely on prd-ubuntu2204-docker-4c-16g-260 (ubuntu2204-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master
17:42:01 [ssh-agent] Looking for ssh-agent implementation...
17:42:01 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
17:42:01 $ ssh-agent
17:42:01 SSH_AUTH_SOCK=/tmp/ssh-XXXXXXjIHnVX/agent.1576
17:42:01 SSH_AGENT_PID=1578
17:42:01 [ssh-agent] Started.
17:42:01 Running ssh-add (command line suppressed)
17:42:01 Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_1538059868374359907.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_1538059868374359907.key)
17:42:01 [ssh-agent] Using credentials jenkins (jenkins-ssh)
17:42:01 The recommended git tool is: NONE
17:42:03 using credential jenkins-ssh
17:42:03 Wiping out workspace first.
17:42:03 Cloning the remote Git repository
17:42:03 Cloning repository git://devvexx.opendaylight.org/mirror/transportpce
17:42:03 > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10
17:42:03 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
17:42:03 > git --version # timeout=10
17:42:03 > git --version # 'git version 2.34.1'
17:42:03 using GIT_SSH to set credentials jenkins-ssh
17:42:03 Verifying host key using known hosts file
17:42:03 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
17:42:03 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10
17:42:07 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
17:42:07 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
17:42:07 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
17:42:07 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
17:42:07 using GIT_SSH to set credentials jenkins-ssh
17:42:07 Verifying host key using known hosts file
17:42:07 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
17:42:07 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/11/120111/1 # timeout=10
17:42:08 > git rev-parse 7bab5df6d60805e7323ddae08846d99e4973abce^{commit} # timeout=10
17:42:08 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script
17:42:08 Checking out Revision 7bab5df6d60805e7323ddae08846d99e4973abce (refs/changes/11/120111/1)
17:42:08 > git config core.sparsecheckout # timeout=10
17:42:08 > git checkout -f 7bab5df6d60805e7323ddae08846d99e4973abce # timeout=10
17:42:11 Commit message: "Bump the lightynode version"
17:42:11 > git rev-parse FETCH_HEAD^{commit} # timeout=10
17:42:11 > git rev-list --no-walk 6ec1623684600d226fdd9b9483b3e7d8bb6c6232 # timeout=10
17:42:11 > git remote # timeout=10
17:42:11 > git submodule init # timeout=10
17:42:11 > git submodule sync # timeout=10
17:42:11 > git config --get remote.origin.url # timeout=10
17:42:11 > git submodule init # timeout=10
17:42:11 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
17:42:11 ERROR: No submodules found.
17:42:11 provisioning config files...
17:42:11 copy managed file [npmrc] to file:/home/jenkins/.npmrc
17:42:11 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
17:42:11 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins9233895976128284596.sh
17:42:11 ---> python-tools-install.sh
17:42:11 Setup pyenv:
17:42:12 * system (set by /opt/pyenv/version)
17:42:12 * 3.8.20 (set by /opt/pyenv/version)
17:42:12 * 3.9.20 (set by /opt/pyenv/version)
17:42:12 3.10.15
17:42:12 3.11.10
17:42:16 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-6Heh
17:42:16 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
17:42:16 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
17:42:16 lf-activate-venv(): INFO: Attempting to install with network-safe options...
17:42:20 lf-activate-venv(): INFO: Base packages installed successfully
17:42:20 lf-activate-venv(): INFO: Installing additional packages: lftools
17:42:49 lf-activate-venv(): INFO: Adding /tmp/venv-6Heh/bin to PATH
17:42:49 Generating Requirements File
17:43:12 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
17:43:12 httplib2 0.31.1 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
17:43:13 Python 3.11.10
17:43:13 pip 25.3 from /tmp/venv-6Heh/lib/python3.11/site-packages/pip (python 3.11)
17:43:13 appdirs==1.4.4
17:43:13 argcomplete==3.6.3
17:43:13 aspy.yaml==1.3.0
17:43:13 attrs==25.4.0
17:43:13 autopage==0.5.2
17:43:13 beautifulsoup4==4.14.3
17:43:13 boto3==1.42.27
17:43:13 botocore==1.42.27
17:43:13 bs4==0.0.2
17:43:13 certifi==2026.1.4
17:43:13 cffi==2.0.0
17:43:13 cfgv==3.5.0
17:43:13 chardet==5.2.0
17:43:13 charset-normalizer==3.4.4
17:43:13 click==8.3.1
17:43:13 cliff==4.13.1
17:43:13 cmd2==3.1.0
17:43:13 cryptography==3.3.2
17:43:13 debtcollector==3.0.0
17:43:13 decorator==5.2.1
17:43:13 defusedxml==0.7.1
17:43:13 Deprecated==1.3.1
17:43:13 distlib==0.4.0
17:43:13 dnspython==2.8.0
17:43:13 docker==7.1.0
17:43:13 dogpile.cache==1.5.0
17:43:13 durationpy==0.10
17:43:13 email-validator==2.3.0
17:43:13 filelock==3.20.3
17:43:13 future==1.0.0
17:43:13 gitdb==4.0.12
17:43:13 GitPython==3.1.46
17:43:13 google-auth==2.47.0
17:43:13 httplib2==0.31.1
17:43:13 identify==2.6.16
17:43:13 idna==3.11
17:43:13 importlib-resources==1.5.0
17:43:13 iso8601==2.1.0
17:43:13 Jinja2==3.1.6
17:43:13 jmespath==1.0.1
17:43:13 jsonpatch==1.33
17:43:13 jsonpointer==3.0.0
17:43:13 jsonschema==4.26.0
17:43:13 jsonschema-specifications==2025.9.1
17:43:13 keystoneauth1==5.12.0
17:43:13 kubernetes==34.1.0
17:43:13 lftools==0.37.19
17:43:13 lxml==6.0.2
17:43:13 markdown-it-py==4.0.0
17:43:13 MarkupSafe==3.0.3
17:43:13 mdurl==0.1.2
17:43:13 msgpack==1.1.2
17:43:13 multi_key_dict==2.0.3
17:43:13 munch==4.0.0
17:43:13 netaddr==1.3.0
17:43:13 niet==1.4.2
17:43:13 nodeenv==1.10.0
17:43:13 oauth2client==4.1.3
17:43:13 oauthlib==3.3.1
17:43:13 openstacksdk==4.8.0
17:43:13 os-service-types==1.8.2
17:43:13 osc-lib==4.3.0
17:43:13 oslo.config==10.2.0
17:43:13 oslo.context==6.2.0
17:43:13 oslo.i18n==6.7.1
17:43:13 oslo.log==8.0.0
17:43:13 oslo.serialization==5.9.0
17:43:13 oslo.utils==9.2.0
17:43:13 packaging==25.0
17:43:13 pbr==7.0.3
17:43:13 platformdirs==4.5.1
17:43:13 prettytable==3.17.0
17:43:13 psutil==7.2.1
17:43:13 pyasn1==0.6.1
17:43:13 pyasn1_modules==0.4.2
17:43:13 pycparser==2.23
17:43:13 pygerrit2==2.0.15
17:43:13 PyGithub==2.8.1
17:43:13 Pygments==2.19.2
17:43:13 PyJWT==2.10.1
17:43:13 PyNaCl==1.6.2
17:43:13 pyparsing==2.4.7
17:43:13 pyperclip==1.11.0
17:43:13 pyrsistent==0.20.0
17:43:13 python-cinderclient==9.8.0
17:43:13 python-dateutil==2.9.0.post0
17:43:13 python-heatclient==4.3.0
17:43:13 python-jenkins==1.8.3
17:43:13 python-keystoneclient==5.7.0
17:43:13 python-magnumclient==4.9.0
17:43:13 python-openstackclient==8.3.0
17:43:13 python-swiftclient==4.9.0
17:43:13 PyYAML==6.0.3
17:43:13 referencing==0.37.0
17:43:13 requests==2.32.5
17:43:13 requests-oauthlib==2.0.0
17:43:13 requestsexceptions==1.4.0
17:43:13 rfc3986==2.0.0
17:43:13 rich==14.2.0
17:43:13 rich-argparse==1.7.2
17:43:13 rpds-py==0.30.0
17:43:13 rsa==4.9.1
17:43:13 ruamel.yaml==0.19.1
17:43:13 ruamel.yaml.clib==0.2.15
17:43:13 s3transfer==0.16.0
17:43:13 simplejson==3.20.2
17:43:13 six==1.17.0
17:43:13 smmap==5.0.2
17:43:13 soupsieve==2.8.1
17:43:13 stevedore==5.6.0
17:43:13 tabulate==0.9.0
17:43:13 toml==0.10.2
17:43:13 tomlkit==0.14.0
17:43:13 tqdm==4.67.1
17:43:13 typing_extensions==4.15.0
17:43:13 tzdata==2025.3
17:43:13 urllib3==1.26.20
17:43:13 virtualenv==20.36.1
17:43:13 wcwidth==0.2.14
17:43:13 websocket-client==1.9.0
17:43:13 wrapt==2.0.1
17:43:13 xdg==6.0.0
17:43:13 xmltodict==1.0.2
17:43:13 yq==3.4.3
17:43:13 [EnvInject] - Injecting environment variables from a build step.
17:43:13 [EnvInject] - Injecting as environment variables the properties content
17:43:13 PYTHON=python3
17:43:13
17:43:13 [EnvInject] - Variables injected successfully.
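[editor's note] The pyparsing conflict reported above reduces to a version comparison: 2.4.7 does not satisfy the `>=3.0.4` bound that httplib2 declares. A minimal, hypothetical helper (not part of this job) showing that comparison with `sort -V`:

```shell
#!/bin/sh
# Hypothetical helper: true if $1 >= $2 in version-sort order,
# the comparison behind the "pyparsing<4,>=3.0.4" requirement.
version_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

if version_ge "2.4.7" "3.0.4"; then
    echo "pyparsing 2.4.7 satisfies >=3.0.4"
else
    echo "pyparsing 2.4.7 violates >=3.0.4"
fi
# prints: pyparsing 2.4.7 violates >=3.0.4 (the conflict flagged above)
```

pip's resolver applies the same ordering logic when it reports the conflict; the warning is non-fatal here, which is why the job continues.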
17:43:13 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins11387148118685381278.sh
17:43:13 ---> tox-install.sh
17:43:13 + source /home/jenkins/lf-env.sh
17:43:13 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
17:43:13 ++ mktemp -d /tmp/venv-XXXX
17:43:13 + lf_venv=/tmp/venv-Xpbr
17:43:13 + local venv_file=/tmp/.os_lf_venv
17:43:13 + local python=python3
17:43:13 + local options
17:43:13 + local set_path=true
17:43:13 + local install_args=
17:43:13 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
17:43:13 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
17:43:13 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
17:43:13 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15
17:43:13 + true
17:43:13 + case $1 in
17:43:13 + venv_file=/tmp/.toxenv
17:43:13 + shift 2
17:43:13 + true
17:43:13 + case $1 in
17:43:13 + shift
17:43:13 + break
17:43:13 + case $python in
17:43:13 + local pkg_list=
17:43:13 + [[ -d /opt/pyenv ]]
17:43:13 + echo 'Setup pyenv:'
17:43:13 Setup pyenv:
17:43:13 + export PYENV_ROOT=/opt/pyenv
17:43:13 + PYENV_ROOT=/opt/pyenv
17:43:13 + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:13 + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:13 + pyenv versions
17:43:13 system
17:43:13 3.8.20
17:43:13 3.9.20
17:43:13 3.10.15
17:43:13 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
17:43:13 + command -v pyenv
17:43:13 ++ pyenv init - --no-rehash
17:43:13 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH);
17:43:13 for i in ${!paths[@]}; do
17:43:13 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\'';
17:43:13 fi; done;
17:43:13 echo "${paths[*]}"'\'')"
17:43:13 export PATH="/opt/pyenv/shims:${PATH}"
17:43:13 export PYENV_SHELL=bash
17:43:13 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\''
17:43:13 pyenv() {
17:43:13 local command
17:43:13 command="${1:-}"
17:43:13 if [ "$#" -gt 0 ]; then
17:43:13 shift
17:43:13 fi
17:43:13
17:43:13 case "$command" in
17:43:13 rehash|shell)
17:43:13 eval "$(pyenv "sh-$command" "$@")"
17:43:13 ;;
17:43:13 *)
17:43:13 command pyenv "$command" "$@"
17:43:13 ;;
17:43:13 esac
17:43:13 }'
17:43:13 +++ bash --norc -ec 'IFS=:; paths=($PATH);
17:43:13 for i in ${!paths[@]}; do
17:43:13 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\'';
17:43:13 fi; done;
17:43:13 echo "${paths[*]}"'
17:43:13 ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:13 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:13 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:13 ++ export PYENV_SHELL=bash
17:43:13 ++ PYENV_SHELL=bash
17:43:13 ++ source /opt/pyenv/libexec/../completions/pyenv.bash
17:43:13 +++ complete -F _pyenv pyenv
17:43:13 ++ lf-pyver python3
17:43:13 ++ local py_version_xy=python3
17:43:13 ++ local py_version_xyz=
17:43:13 ++ pyenv versions
17:43:13 ++ local command
17:43:13 ++ awk '{ print $1 }'
17:43:13 ++ command=versions
17:43:13 ++ '[' 1 -gt 0 ']'
17:43:13 ++ shift
17:43:13 ++ case "$command" in
17:43:13 ++ command pyenv versions
17:43:13 ++ sed 's/^[ *]* //'
17:43:13 ++ grep -E '^[0-9.]*[0-9]$'
17:43:14 ++ [[ ! -s /tmp/.pyenv_versions ]]
17:43:14 +++ grep '^3' /tmp/.pyenv_versions
17:43:14 +++ sort -V
17:43:14 +++ tail -n 1
17:43:14 ++ py_version_xyz=3.11.10
17:43:14 ++ [[ -z 3.11.10 ]]
17:43:14 ++ echo 3.11.10
17:43:14 ++ return 0
17:43:14 + pyenv local 3.11.10
17:43:14 + local command
17:43:14 + command=local
17:43:14 + '[' 2 -gt 0 ']'
17:43:14 + shift
17:43:14 + case "$command" in
17:43:14 + command pyenv local 3.11.10
17:43:14 + for arg in "$@"
17:43:14 + case $arg in
17:43:14 + pkg_list+='tox '
17:43:14 + for arg in "$@"
17:43:14 + case $arg in
17:43:14 + pkg_list+='virtualenv '
17:43:14 + for arg in "$@"
17:43:14 + case $arg in
17:43:14 + pkg_list+='urllib3~=1.26.15 '
17:43:14 + [[ -f /tmp/.toxenv ]]
17:43:14 + [[ ! -f /tmp/.toxenv ]]
17:43:14 + [[ -n '' ]]
17:43:14 + python3 -m venv /tmp/venv-Xpbr
17:43:18 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-Xpbr'
17:43:18 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-Xpbr
17:43:18 + echo /tmp/venv-Xpbr
17:43:18 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv'
17:43:18 lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv
17:43:18 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)'
17:43:18 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
17:43:18 + local 'pip_opts=--upgrade --quiet'
17:43:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org'
17:43:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org'
17:43:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org'
17:43:18 + [[ -n '' ]]
17:43:18 + [[ -n '' ]]
17:43:18 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...'
17:43:18 lf-activate-venv(): INFO: Attempting to install with network-safe options...
17:43:18 + /tmp/venv-Xpbr/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv
17:43:22 + echo 'lf-activate-venv(): INFO: Base packages installed successfully'
17:43:22 lf-activate-venv(): INFO: Base packages installed successfully
17:43:22 + [[ -z tox virtualenv urllib3~=1.26.15 ]]
17:43:22 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 '
17:43:22 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15
17:43:22 + /tmp/venv-Xpbr/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15
17:43:24 + type python3
17:43:24 + true
17:43:24 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-Xpbr/bin to PATH'
17:43:24 lf-activate-venv(): INFO: Adding /tmp/venv-Xpbr/bin to PATH
17:43:24 + PATH=/tmp/venv-Xpbr/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 + return 0
17:43:24 + python3 --version
17:43:24 Python 3.11.10
17:43:24 + python3 -m pip --version
17:43:24 pip 25.3 from /tmp/venv-Xpbr/lib/python3.11/site-packages/pip (python 3.11)
17:43:24 + python3 -m pip freeze
17:43:24 cachetools==6.2.4
17:43:24 chardet==5.2.0
17:43:24 colorama==0.4.6
17:43:24 distlib==0.4.0
17:43:24 filelock==3.20.3
17:43:24 packaging==25.0
17:43:24 platformdirs==4.5.1
17:43:24 pluggy==1.6.0
17:43:24 pyproject-api==1.10.0
17:43:24 tox==4.34.1
17:43:24 urllib3==1.26.20
17:43:24 virtualenv==20.36.1
17:43:24 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins701067016748249680.sh
17:43:24 [EnvInject] - Injecting environment variables from a build step.
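[editor's note] The xtrace above shows lf-activate-venv recording its venv path in the `--venv-file` (/tmp/.toxenv) so that the later tox-run step can reuse the same venv instead of creating a second one. A minimal sketch of that save/reuse pattern, with illustrative paths rather than the job's real ones:

```shell
#!/bin/sh
# Sketch of the venv-file save/reuse pattern from the trace above.
# Paths are illustrative; the real job uses /tmp/.toxenv.
venv_file=/tmp/.toxenv.example.$$

if [ -f "$venv_file" ]; then
    # a previous run already made a venv: read its path back and reuse it
    lf_venv=$(cat "$venv_file")
    echo "lf-activate-venv(): INFO: Reuse venv:$lf_venv from file:$venv_file"
else
    # first run: create a venv directory and record its path for later steps
    lf_venv=$(mktemp -d /tmp/venv-XXXX)
    echo "$lf_venv" > "$venv_file"
    echo "lf-activate-venv(): INFO: Save venv in file: $venv_file"
fi
PATH="$lf_venv/bin:$PATH"    # tools installed in the venv shadow system ones
```

This explains the "Save venv in file: /tmp/.toxenv" message here and the "Reuse venv:/tmp/venv-Xpbr" message in the tox-run step below: both scripts point at the same marker file.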
17:43:24 [EnvInject] - Injecting as environment variables the properties content
17:43:24 PARALLEL=True
17:43:24
17:43:24 [EnvInject] - Variables injected successfully.
17:43:24 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins16031120757645718059.sh
17:43:24 ---> tox-run.sh
17:43:24 + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox
17:43:24 + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs
17:43:24 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox
17:43:24 + cd /w/workspace/transportpce-tox-verify-transportpce-master/.
17:43:24 + source /home/jenkins/lf-env.sh
17:43:24 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
17:43:24 ++ mktemp -d /tmp/venv-XXXX
17:43:24 + lf_venv=/tmp/venv-Lymm
17:43:24 + local venv_file=/tmp/.os_lf_venv
17:43:24 + local python=python3
17:43:24 + local options
17:43:24 + local set_path=true
17:43:24 + local install_args=
17:43:24 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
17:43:24 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
17:43:24 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
17:43:24 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15
17:43:24 + true
17:43:24 + case $1 in
17:43:24 + venv_file=/tmp/.toxenv
17:43:24 + shift 2
17:43:24 + true
17:43:24 + case $1 in
17:43:24 + shift
17:43:24 + break
17:43:24 + case $python in
17:43:24 + local pkg_list=
17:43:24 + [[ -d /opt/pyenv ]]
17:43:24 + echo 'Setup pyenv:'
17:43:24 Setup pyenv:
17:43:24 + export PYENV_ROOT=/opt/pyenv
17:43:24 + PYENV_ROOT=/opt/pyenv
17:43:24 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 + pyenv versions
17:43:24 system
17:43:24 3.8.20
17:43:24 3.9.20
17:43:24 3.10.15
17:43:24 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
17:43:24 + command -v pyenv
17:43:24 ++ pyenv init - --no-rehash
17:43:24 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH);
17:43:24 for i in ${!paths[@]}; do
17:43:24 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\'';
17:43:24 fi; done;
17:43:24 echo "${paths[*]}"'\'')"
17:43:24 export PATH="/opt/pyenv/shims:${PATH}"
17:43:24 export PYENV_SHELL=bash
17:43:24 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\''
17:43:24 pyenv() {
17:43:24 local command
17:43:24 command="${1:-}"
17:43:24 if [ "$#" -gt 0 ]; then
17:43:24 shift
17:43:24 fi
17:43:24
17:43:24 case "$command" in
17:43:24 rehash|shell)
17:43:24 eval "$(pyenv "sh-$command" "$@")"
17:43:24 ;;
17:43:24 *)
17:43:24 command pyenv "$command" "$@"
17:43:24 ;;
17:43:24 esac
17:43:24 }'
17:43:24 +++ bash --norc -ec 'IFS=:; paths=($PATH);
17:43:24 for i in ${!paths[@]}; do
17:43:24 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\'';
17:43:24 fi; done;
17:43:24 echo "${paths[*]}"'
17:43:24 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:24 ++ export PYENV_SHELL=bash
17:43:24 ++ PYENV_SHELL=bash
17:43:24 ++ source /opt/pyenv/libexec/../completions/pyenv.bash
17:43:24 +++ complete -F _pyenv pyenv
17:43:24 ++ lf-pyver python3
17:43:24 ++ local py_version_xy=python3
17:43:24 ++ local py_version_xyz=
17:43:24 ++ pyenv versions
17:43:24 ++ local command
17:43:24 ++ sed 's/^[ *]* //'
17:43:24 ++ command=versions
17:43:24 ++ '[' 1 -gt 0 ']'
17:43:24 ++ shift
17:43:24 ++ case "$command" in
17:43:24 ++ command pyenv versions
17:43:24 ++ awk '{ print $1 }'
17:43:24 ++ grep -E '^[0-9.]*[0-9]$'
17:43:24 ++ [[ ! -s /tmp/.pyenv_versions ]]
17:43:24 +++ grep '^3' /tmp/.pyenv_versions
17:43:24 +++ sort -V
17:43:24 +++ tail -n 1
17:43:24 ++ py_version_xyz=3.11.10
17:43:24 ++ [[ -z 3.11.10 ]]
17:43:24 ++ echo 3.11.10
17:43:24 ++ return 0
17:43:24 + pyenv local 3.11.10
17:43:24 + local command
17:43:24 + command=local
17:43:24 + '[' 2 -gt 0 ']'
17:43:24 + shift
17:43:24 + case "$command" in
17:43:24 + command pyenv local 3.11.10
17:43:24 + for arg in "$@"
17:43:24 + case $arg in
17:43:24 + pkg_list+='tox '
17:43:24 + for arg in "$@"
17:43:24 + case $arg in
17:43:24 + pkg_list+='virtualenv '
17:43:24 + for arg in "$@"
17:43:24 + case $arg in
17:43:24 + pkg_list+='urllib3~=1.26.15 '
17:43:24 + [[ -f /tmp/.toxenv ]]
17:43:24 ++ cat /tmp/.toxenv
17:43:24 + lf_venv=/tmp/venv-Xpbr
17:43:24 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-Xpbr from' file:/tmp/.toxenv
17:43:24 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-Xpbr from file:/tmp/.toxenv
17:43:24 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)'
17:43:24 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
17:43:24 + local 'pip_opts=--upgrade --quiet'
17:43:24 + pip_opts='--upgrade --quiet --trusted-host pypi.org'
17:43:24 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org'
17:43:24 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org'
17:43:24 + [[ -n '' ]]
17:43:24 + [[ -n '' ]]
17:43:24 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...'
17:43:24 lf-activate-venv(): INFO: Attempting to install with network-safe options...
17:43:24 + /tmp/venv-Xpbr/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv
17:43:25 + echo 'lf-activate-venv(): INFO: Base packages installed successfully'
17:43:25 lf-activate-venv(): INFO: Base packages installed successfully
17:43:25 + [[ -z tox virtualenv urllib3~=1.26.15 ]]
17:43:25 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 '
17:43:25 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15
17:43:25 + /tmp/venv-Xpbr/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15
17:43:27 + type python3
17:43:27 + true
17:43:27 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-Xpbr/bin to PATH'
17:43:27 lf-activate-venv(): INFO: Adding /tmp/venv-Xpbr/bin to PATH
17:43:27 + PATH=/tmp/venv-Xpbr/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:27 + return 0
17:43:27 + [[ -d /opt/pyenv ]]
17:43:27 + echo '---> Setting up pyenv'
17:43:27 ---> Setting up pyenv
17:43:27 + export PYENV_ROOT=/opt/pyenv
17:43:27 + PYENV_ROOT=/opt/pyenv
17:43:27 + export PATH=/opt/pyenv/bin:/tmp/venv-Xpbr/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:27 + PATH=/opt/pyenv/bin:/tmp/venv-Xpbr/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
17:43:27 ++ pwd
17:43:27 + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master
17:43:27 + export PYTHONPATH
17:43:27 + export TOX_TESTENV_PASSENV=PYTHONPATH
17:43:27 + TOX_TESTENV_PASSENV=PYTHONPATH
17:43:27 + tox --version
17:43:27 4.34.1 from /tmp/venv-Xpbr/lib/python3.11/site-packages/tox/__init__.py
17:43:27 + PARALLEL=True
17:43:27 + TOX_OPTIONS_LIST=
17:43:27 + [[ -n '' ]]
17:43:27 + case ${PARALLEL,,} in
17:43:27 + TOX_OPTIONS_LIST=' --parallel auto --parallel-live'
17:43:27 + tox --parallel auto --parallel-live
17:43:27 + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log
17:43:29 buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:43:29 docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt
17:43:29 docs: install_deps> python -I -m pip install -r docs/requirements.txt
17:43:29 checkbashisms: freeze> python -m pip freeze --all
17:43:29 checkbashisms: pip==25.3,setuptools==80.9.0
17:43:29 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
17:43:29 checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)'
17:43:29 checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' +
17:43:31 checkbashisms: OK ✔ in 3.38 seconds
17:43:31 pre-commit: install_deps> python -I -m pip install pre-commit
17:43:33 pre-commit: freeze> python -m pip freeze --all
17:43:34 pre-commit: cfgv==3.5.0,distlib==0.4.0,filelock==3.20.3,identify==2.6.16,nodeenv==1.10.0,pip==25.3,platformdirs==4.5.1,pre_commit==4.5.1,PyYAML==6.0.3,setuptools==80.9.0,virtualenv==20.36.1
17:43:34 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
17:43:34 pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)'
17:43:34 /usr/bin/cpan
17:43:34 pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure
17:43:34 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
17:43:34 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
17:43:34 [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
17:43:34 [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo.
17:43:34 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint.
17:43:35 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps].
17:43:35 [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks.
17:43:35 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8.
17:43:36 [INFO] Initializing environment for https://github.com/perltidy/perltidy.
17:43:36 buildcontroller: freeze> python -m pip freeze --all
17:43:36 buildcontroller: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:43:36 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh
17:43:36 + update-java-alternatives -l
17:43:36 java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64
17:43:36 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64
17:43:36 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64
17:43:36 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64
17:43:36 update-alternatives: error: no alternatives for jaotc
17:43:36 update-alternatives: error: no alternatives for rmic
17:43:36 + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p;
17:43:36 + java -version
17:43:36 + JAVA_VER=21
17:43:36 + echo 21
17:43:36 21
17:43:36 + javac -version
17:43:36 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p;
17:43:36 [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
17:43:36 [INFO] Once installed this environment will be reused.
17:43:36 [INFO] This may take a few minutes...
17:43:37 + JAVAC_VER=21
17:43:37 + echo 21
17:43:37 21
17:43:37 ok, java is 21 or newer
17:43:37 + [ 21 -ge 21 ]
17:43:37 + [ 21 -ge 21 ]
17:43:37 + echo ok, java is 21 or newer
17:43:37 + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz -P /tmp
17:43:38 2026-01-14 17:43:38 URL:https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz [9233336/9233336] -> "/tmp/apache-maven-3.9.12-bin.tar.gz" [1]
17:43:38 + sudo mkdir -p /opt
17:43:38 + sudo tar xf /tmp/apache-maven-3.9.12-bin.tar.gz -C /opt
17:43:38 + sudo ln -s /opt/apache-maven-3.9.12 /opt/maven
17:43:38 + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn
17:43:38 + mvn --version
17:43:38 Apache Maven 3.9.12 (848fbb4bf2d427b72bdb2471c22fced7ebd9a7a1)
17:43:38 Maven home: /opt/maven
17:43:38 Java version: 21.0.9, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64
17:43:38 Default locale: en, platform encoding: UTF-8
17:43:38 OS name: "linux", version: "5.15.0-164-generic", arch: "amd64", family: "unix"
17:43:38 NOTE: Picked up JDK_JAVA_OPTIONS:
17:43:38 --add-opens=java.base/java.io=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.lang=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.net=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.nio=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.nio.file=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.util=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.util.jar=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.util.stream=ALL-UNNAMED
17:43:38 --add-opens=java.base/java.util.zip=ALL-UNNAMED
17:43:38 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
17:43:38 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
17:43:38 -Xlog:disable
17:43:41 [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks.
17:43:41 [INFO] Once installed this environment will be reused.
17:43:41 [INFO] This may take a few minutes...
17:43:49 [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8.
17:43:49 [INFO] Once installed this environment will be reused.
17:43:49 [INFO] This may take a few minutes...
17:43:53 [INFO] Installing environment for https://github.com/perltidy/perltidy.
17:43:53 [INFO] Once installed this environment will be reused.
17:43:53 [INFO] This may take a few minutes...
17:43:57 docs-linkcheck: freeze> python -m pip freeze --all
17:43:58 docs-linkcheck: alabaster==1.0.0,attrs==25.4.0,babel==2.17.0,blockdiag==3.0.0,certifi==2026.1.4,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.22.4,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs-conf==0.9.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.1,nwdiag==3.0.0,packaging==25.0,pillow==12.1.0,pip==25.3,Pygments==2.19.2,pyparsing==3.3.1,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,seqdiag==3.0.0,setuptools==80.9.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==9.0.4,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3=
=2.6.3,webcolors==25.10.0 17:43:58 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck 17:43:58 docs: freeze> python -m pip freeze --all 17:43:59 docs: alabaster==1.0.0,attrs==25.4.0,babel==2.17.0,blockdiag==3.0.0,certifi==2026.1.4,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.22.4,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs-conf==0.9.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.1,nwdiag==3.0.0,packaging==25.0,pillow==12.1.0,pip==25.3,Pygments==2.19.2,pyparsing==3.3.1,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,seqdiag==3.0.0,setuptools==80.9.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==9.0.4,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.6.3,webcolors==25.10.0 17:43:59 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html 17:44:02 docs: OK ✔ in 34.48 seconds 17:44:02 pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' 17:44:06 trim 
trailing whitespace.................................................Passed 17:44:06 Tabs remover.............................................................Passed 17:44:06 docs-linkcheck: OK ✔ in 36.23 seconds 17:44:06 pylint: freeze> python -m pip freeze --all 17:44:06 autopep8.................................................................pylint: astroid==4.0.3,dill==0.4.0,isort==7.0.0,mccabe==0.7.0,pip==25.3,platformdirs==4.5.1,pylint==4.0.4,setuptools==80.9.0,tomlkit==0.14.0 17:44:07 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + 17:44:12 Passed 17:44:12 perltidy.................................................................Passed 17:44:13 pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual 17:44:13 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 17:44:13 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 17:44:13 [INFO] Installing environment for https://github.com/jorisroovers/gitlint. 17:44:13 [INFO] Once installed this environment will be reused. 17:44:13 [INFO] This may take a few minutes... 
17:44:22 gitlint..................................................................Passed
17:44:31
17:44:31 ------------------------------------
17:44:31 Your code has been rated at 10.00/10
17:44:31
17:45:22 pre-commit: OK ✔ in 51.14 seconds
17:45:22 pylint: OK ✔ in 30.69 seconds
17:45:22 buildcontroller: OK ✔ in 1 minute 53.73 seconds
17:45:22 build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:45:22 build_karaf_tests190: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:45:22 build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:45:22 build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:45:28 build_karaf_tests71: freeze> python -m pip freeze --all
17:45:28 build_karaf_tests221: freeze> python -m pip freeze --all
17:45:28 build_karaf_tests121: freeze> python -m pip freeze --all
17:45:29 build_karaf_tests190: freeze> python -m pip freeze --all
17:45:29 build_karaf_tests71: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:45:29 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
17:45:29 build karaf in karaf71 with ./karaf71.env
17:45:29 build_karaf_tests221: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:45:29 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
17:45:29 build karaf in karaf221 with ./karaf221.env
17:45:29 build_karaf_tests121: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:45:29 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
17:45:29 build karaf in karaf121 with ./karaf121.env
17:45:29 NOTE: Picked up JDK_JAVA_OPTIONS:
17:45:29 --add-opens=java.base/java.io=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.net=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.file=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.jar=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.stream=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.zip=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
17:45:29 -Xlog:disable
17:45:29 build_karaf_tests190: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:45:29 build_karaf_tests190: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
17:45:29 build karaf in karafoc with ./karafoc.env
17:45:29 NOTE: Picked up JDK_JAVA_OPTIONS:
17:45:29 --add-opens=java.base/java.io=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.net=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.file=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.jar=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.stream=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.zip=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
17:45:29 -Xlog:disable
17:45:29 NOTE: Picked up JDK_JAVA_OPTIONS:
17:45:29 --add-opens=java.base/java.io=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.net=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.file=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.jar=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.stream=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.zip=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
17:45:29 -Xlog:disable
17:45:29 NOTE: Picked up JDK_JAVA_OPTIONS:
17:45:29 --add-opens=java.base/java.io=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.net=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.nio.file=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.jar=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.stream=ALL-UNNAMED
17:45:29 --add-opens=java.base/java.util.zip=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
17:45:29 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
17:45:29 -Xlog:disable
17:46:25 build_karaf_tests121: OK ✔ in 1 minute 4.46 seconds
17:46:25 buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:46:28 [ERROR] Failed to execute goal org.apache.maven.plugins:maven-install-plugin:3.1.4:install (default-install) on project transportpce-karaf: Failed to install metadata org.opendaylight.transportpce:transportpce-karaf:13.0.0-SNAPSHOT/maven-metadata.xml: Could not read metadata /home/jenkins/.m2/repository/org/opendaylight/transportpce/transportpce-karaf/13.0.0-SNAPSHOT/maven-metadata-local.xml: input contained no data -> [Help 1]
17:46:28 [ERROR]
17:46:28 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
17:46:28 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
17:46:28 [ERROR]
17:46:28 [ERROR] For more information about the errors and possible solutions, please read the following articles:
17:46:28 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
17:46:29 build_karaf_tests71: OK ✔ in 1 minute 7.46 seconds
17:46:29 build_karaf_tests221: OK ✔ in 1 minute 7.49 seconds
17:46:29 build_karaf_tests190: OK ✔ in 1 minute 7.54 seconds
17:46:29 sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:46:29 testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:46:37 buildlighty: freeze> python -m pip freeze --all
17:46:37 sims: freeze> python -m pip freeze --all
17:46:37 buildlighty: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:46:37 buildlighty: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh
17:46:37 NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
17:46:37 sims: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:46:37 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh
17:46:37 Using lighynode version 22.1.0.1
17:46:37 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory
17:47:21 sims: OK ✔ in 12.84 seconds
17:47:21 buildlighty: OK ✔ in 35.74 seconds
17:47:21 testsPCE: freeze> python -m pip freeze --all
17:47:22 testsPCE: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,click==8.3.1,contourpy==1.3.3,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.7,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.61.1,gnpy4tpce==2.4.7,idna==3.11,iniconfig==2.3.0,injector==0.24.0,invoke==2.2.1,itsdangerous==2.2.0,Jinja2==3.1.6,kiwisolver==1.4.9,lxml==6.0.2,MarkupSafe==3.0.3,matplotlib==3.10.8,netconf-client==3.5.0,networkx==2.8.8,numpy==1.26.4,packaging==25.0,pandas==1.5.3,paramiko==4.0.0,pbr==5.11.1,pillow==12.1.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pyparsing==3.3.1,pytest==9.0.2,python-dateutil==2.9.0.post0,pytz==2025.2,requests==2.32.5,scipy==1.17.0,setuptools==50.3.2,six==1.17.0,urllib3==2.6.3,Werkzeug==2.0.3,xlrd==1.2.0
17:47:22 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce
17:47:22 pytest -q transportpce_tests/pce/test01_pce.py
17:48:10 .................... [100%]
17:49:14 20 passed in 111.76s (0:01:51)
17:49:14 pytest -q transportpce_tests/pce/test02_pce_400G.py
17:49:30 ............ [100%]
17:50:01 12 passed in 46.50s
17:50:01 pytest -q transportpce_tests/pce/test03_gnpy.py
17:50:18 ........ [100%]
17:50:39 8 passed in 37.96s
17:50:39 pytest -q transportpce_tests/pce/test04_pce_bug_fix.py
17:51:10 ... [100%]
17:51:15 3 passed in 36.08s
17:51:16 testsPCE: OK ✔ in 4 minutes 46.83 seconds
17:51:16 tests190: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:51:16 tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:51:16 tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:51:23 tests190: freeze> python -m pip freeze --all
17:51:23 tests121: freeze> python -m pip freeze --all
17:51:23 tests_tapi: freeze> python -m pip freeze --all
17:51:23 tests190: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:51:23 tests190: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh oc
17:51:23 using environment variables from ./karafoc.env
17:51:23 pytest -q transportpce_tests/oc/test01_portmapping.py
17:51:24 tests121: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:51:24 tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1
17:51:24 using environment variables from ./karaf121.env
17:51:24 pytest -q transportpce_tests/1.2.1/test01_portmapping.py
17:51:24 tests_tapi: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:51:24 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi
17:51:24 using environment variables from ./karaf221.env
17:51:24 pytest -q transportpce_tests/tapi/test01_abstracted_topology.py
17:52:25 ........... [100%]
17:52:41 10 passed in 76.92s (0:01:16)
17:52:41 pytest -q transportpce_tests/oc/test02_topology.py
17:52:49 ..................................... [100%]
17:53:29 21 passed in 125.49s (0:02:05)
17:53:29 pytest -q transportpce_tests/1.2.1/test02_topo_portmapping.py
17:53:31 ....................... [100%]
17:53:47 14 passed in 65.97s (0:01:05)
17:53:47 pytest -q transportpce_tests/oc/test03_renderer.py
17:53:49 ................... [100%]
17:54:30 6 passed in 60.22s (0:01:00)
17:54:30 pytest -q transportpce_tests/1.2.1/test03_topology.py
17:54:30 .F.....F....F.FF. [100%]
17:54:47 =================================== FAILURES ===================================
17:54:47 _ TestTransportPCERenderer.test_07_service_path_create_network_check_optical_channel _
17:54:47
17:54:47 self =
17:54:47
17:54:47 def test_07_service_path_create_network_check_optical_channel(self):
17:54:47 response = test_utils_oc.check_node_attribute2_request("XPDR-OC", "component", "cfp2-opt-1-1",
17:54:47 "openconfig-terminal-device:optical-channel")
17:54:47 self.assertEqual(response['status_code'], requests.codes.ok)
17:54:47 optchannel = response['openconfig-terminal-device:optical-channel']
17:54:47 expected_optchannel = {'frequency': '194100000',
17:54:47 'target-output-power': '0.0',
17:54:47 'operational-mode': 4308
17:54:47 }
17:54:47 self.assertDictEqual(
17:54:47 dict(expected_optchannel, **optchannel['config']), optchannel['config'])
17:54:47 for key, value in expected_optchannel.items():
17:54:47 self.assertIn(key, optchannel['state'])
17:54:47 > self.assertEqual(optchannel['state'][key], value)
17:54:47 E AssertionError: '192499999' != '194100000'
17:54:47 E - 192499999
17:54:47 E + 194100000
17:54:47
17:54:47 transportpce_tests/oc/test03_renderer.py:128: AssertionError
17:54:47 ----------------------------- Captured stdout call -----------------------------
17:54:47 execution of test_07_service_path_create_network_check_optical_channel
17:54:47 _ TestTransportPCERenderer.test_10_service_path_delete_network_check_optical_port _
17:54:47
17:54:47 self =
17:54:47
17:54:47 def test_10_service_path_delete_network_check_optical_port(self):
17:54:47 response = test_utils_oc.check_node_attribute3_request("XPDR-OC", "component", "line-cfp2-1", "port",
17:54:47 "openconfig-transport-line-common:optical-port")
17:54:47 self.assertEqual(response['status_code'], requests.codes.ok)
17:54:47 > self.assertEqual(response['port']['config']['admin-state'], 'DISABLED')
17:54:47 E AssertionError: 'ENABLED' != 'DISABLED'
17:54:47 E - ENABLED
17:54:47 E + DISABLED
17:54:47
17:54:47 transportpce_tests/oc/test03_renderer.py:154: AssertionError
17:54:47 ----------------------------- Captured stdout call -----------------------------
17:54:47 execution of test_10_service_path_delete_network_check_optical_port
17:54:47 _ TestTransportPCERenderer.test_15_service_path_create_client_check_interfaces _
17:54:47
17:54:47 self =
17:54:47
17:54:47 def test_15_service_path_create_client_check_interfaces(self):
17:54:47 for interfaceid in TestTransportPCERenderer.interface_id:
17:54:47 response = test_utils_oc.check_interface_attribute_request("XPDR-OC", "interface", interfaceid)
17:54:47 > self.assertEqual(response['interface'][0]['name'], interfaceid)
17:54:47 ^^^^^^^^^^^^^^^^^^^^^^^^
17:54:47 E KeyError: 0
17:54:47
17:54:47 transportpce_tests/oc/test03_renderer.py:245: KeyError
17:54:47 ----------------------------- Captured stdout call -----------------------------
17:54:47 execution of test_15_service_path_create_client_check_interfaces
17:54:47 _ TestTransportPCERenderer.test_17_service_path_delete_client_check_optical_port _
17:54:47
17:54:47 self =
17:54:47
17:54:47 def test_17_service_path_delete_client_check_optical_port(self):
17:54:47 response = test_utils_oc.check_node_attribute3_request("XPDR-OC", "component",
17:54:47 TestTransportPCERenderer.port_id[0], "port",
17:54:47 "openconfig-transport-line-common:optical-port")
17:54:47 self.assertEqual(response['status_code'], requests.codes.ok)
17:54:47 > self.assertEqual(response['port']['config']['admin-state'], 'DISABLED')
17:54:47 E AssertionError: 'ENABLED' != 'DISABLED'
17:54:47 E - ENABLED
17:54:47 E + DISABLED
17:54:47
17:54:47 transportpce_tests/oc/test03_renderer.py:274: AssertionError
17:54:47 ----------------------------- Captured stdout call -----------------------------
17:54:47 execution of test_17_service_path_delete_client_check_optical_port
17:54:47 _ TestTransportPCERenderer.test_18_service_path_delete_client_check_properties _
17:54:47
17:54:47 self =
17:54:47
17:54:47 def test_18_service_path_delete_client_check_properties(self):
17:54:47 response = test_utils_oc.check_node_attribute2_request("XPDR-OC", "component", "qsfp-opt-1-4",
17:54:47 "openconfig-platform:properties")
17:54:47 self.assertEqual(response['status_code'], requests.codes.ok)
17:54:47 for prop in response['openconfig-platform:properties']['property']:
17:54:47 self.assertEqual(prop['name'], 'tx-dis')
17:54:47 > self.assertEqual(prop['config'], {'name': 'tx-dis', 'value': 'TRUE'})
17:54:47 E AssertionError: {'name': 'tx-dis', 'value': 'FALSE'} != {'name': 'tx-dis', 'value': 'TRUE'}
17:54:47 E - {'name': 'tx-dis', 'value': 'FALSE'}
17:54:47 E ? ^^^^
17:54:47 E
17:54:47 E + {'name': 'tx-dis', 'value': 'TRUE'}
17:54:47 E ? ^^^
17:54:47
17:54:47 transportpce_tests/oc/test03_renderer.py:284: AssertionError
17:54:47 ----------------------------- Captured stdout call -----------------------------
17:54:47 execution of test_18_service_path_delete_client_check_properties
17:54:47 =========================== short test summary info ============================
17:54:47 FAILED transportpce_tests/oc/test03_renderer.py::TestTransportPCERenderer::test_07_service_path_create_network_check_optical_channel
17:54:47 FAILED transportpce_tests/oc/test03_renderer.py::TestTransportPCERenderer::test_10_service_path_delete_network_check_optical_port
17:54:47 FAILED transportpce_tests/oc/test03_renderer.py::TestTransportPCERenderer::test_15_service_path_create_client_check_interfaces
17:54:47 FAILED transportpce_tests/oc/test03_renderer.py::TestTransportPCERenderer::test_17_service_path_delete_client_check_optical_port
17:54:47 FAILED transportpce_tests/oc/test03_renderer.py::TestTransportPCERenderer::test_18_service_path_delete_client_check_properties
17:54:47 5 failed, 14 passed in 58.97s
17:54:47 tests190: exit 1 (203.33 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh oc pid=7339
17:54:47 tests190: FAIL ✖ in 3 minutes 31.56 seconds
17:54:47 tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
17:54:54 .tests71: freeze> python -m pip freeze --all
17:54:55 tests71: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
17:54:55 tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1
17:54:55 using environment variables from ./karaf71.env
17:54:55 pytest -q transportpce_tests/7.1/test01_portmapping.py
17:55:14 ............................ [100%]
17:55:49 12 passed in 54.35s
17:55:49 pytest -q transportpce_tests/7.1/test02_otn_renderer.py
17:55:50 ............................F.F.F.F.F.. [100%]
17:57:04 44 passed in 153.86s (0:02:33)
17:57:04 pytest -q transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py
17:57:31 FFF....FFFF...F.....F..F.F.F.F.F.F.F.F.............. [100%]
17:59:03 =================================== FAILURES ===================================
17:59:03 _____________ TestTransportPCERenderer.test_05_service_path_create _____________
17:59:03
17:59:03 self =
17:59:03 conn =
17:59:03 method = 'POST'
17:59:03 url = '/rests/operations/transportpce-device-renderer:service-path'
17:59:03 body = '{"input": {"service-name": "service_test", "wave-number": "7", "modulation-format": "dp-qpsk", "operation": "create",... 40, "min-freq": 195.775, "max-freq": 195.825, "lower-spectral-slot-number": 713, "higher-spectral-slot-number": 720}}'
17:59:03 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '441', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
17:59:03 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
17:59:03 timeout = Timeout(connect=30, read=30, total=None), chunked = False
17:59:03 response_conn =
17:59:03 preload_content = False, decode_content = False, enforce_content_length = True
17:59:03
17:59:03 def _make_request(
17:59:03 self,
17:59:03 conn: BaseHTTPConnection,
17:59:03 method: str,
17:59:03 url: str,
17:59:03 body: _TYPE_BODY | None = None,
17:59:03 headers: typing.Mapping[str, str] | None = None,
17:59:03 retries: Retry | None = None,
17:59:03 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
17:59:03 chunked: bool = False,
17:59:03 response_conn: BaseHTTPConnection | None = None,
17:59:03 preload_content: bool = True,
17:59:03 decode_content: bool = True,
17:59:03 enforce_content_length: bool = True,
17:59:03 ) -> BaseHTTPResponse:
17:59:03 """
17:59:03 Perform a request on a given urllib connection object taken from our
17:59:03 pool.
17:59:03
17:59:03 :param conn:
17:59:03 a connection from one of our connection pools
17:59:03
17:59:03 :param method:
17:59:03 HTTP request method (such as GET, POST, PUT, etc.)
17:59:03
17:59:03 :param url:
17:59:03 The URL to perform the request on.
17:59:03
17:59:03 :param body:
17:59:03 Data to send in the request body, either :class:`str`, :class:`bytes`,
17:59:03 an iterable of :class:`str`/:class:`bytes`, or a file-like object.
17:59:03
17:59:03 :param headers:
17:59:03 Dictionary of custom headers to send, such as User-Agent,
17:59:03 If-None-Match, etc. If None, pool headers are used. If provided,
17:59:03 these headers completely replace any pool-specific headers.
17:59:03
17:59:03 :param retries:
17:59:03 Configure the number of retries to allow before raising a
17:59:03 :class:`~urllib3.exceptions.MaxRetryError` exception.
17:59:03
17:59:03 Pass ``None`` to retry until you receive a response. Pass a
17:59:03 :class:`~urllib3.util.retry.Retry` object for fine-grained control
17:59:03 over different types of retries.
17:59:03 Pass an integer number to retry connection errors that many times,
17:59:03 but no other types of errors. Pass zero to never retry.
17:59:03
17:59:03 If ``False``, then retries are disabled and any exception is raised
17:59:03 immediately. Also, instead of raising a MaxRetryError on redirects,
17:59:03 the redirect response will be returned.
17:59:03
17:59:03 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
17:59:03
17:59:03 :param timeout:
17:59:03 If specified, overrides the default timeout for this one
17:59:03 request. It may be a float (in seconds) or an instance of
17:59:03 :class:`urllib3.util.Timeout`.
17:59:03
17:59:03 :param chunked:
17:59:03 If True, urllib3 will send the body using chunked transfer
17:59:03 encoding. Otherwise, urllib3 will send the body using the standard
17:59:03 content-length form. Defaults to False.
17:59:03
17:59:03 :param response_conn:
17:59:03 Set this to ``None`` if you will handle releasing the connection or
17:59:03 set the connection to have the response release it.
17:59:03
17:59:03 :param preload_content:
17:59:03 If True, the response's body will be preloaded during construction.
17:59:03
17:59:03 :param decode_content:
17:59:03 If True, will attempt to decode the body based on the
17:59:03 'content-encoding' header.
17:59:03
17:59:03 :param enforce_content_length:
17:59:03 Enforce content length checking. Body returned by server must match
17:59:03 value of Content-Length header, if present. Otherwise, raise error.
17:59:03 """ 17:59:03 self.num_requests += 1 17:59:03 17:59:03 timeout_obj = self._get_timeout(timeout) 17:59:03 timeout_obj.start_connect() 17:59:03 conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 17:59:03 17:59:03 try: 17:59:03 # Trigger any extra validation we need to do. 17:59:03 try: 17:59:03 self._validate_conn(conn) 17:59:03 except (SocketTimeout, BaseSSLError) as e: 17:59:03 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 17:59:03 raise 17:59:03 17:59:03 # _validate_conn() starts the connection to an HTTPS proxy 17:59:03 # so we need to wrap errors with 'ProxyError' here too. 17:59:03 except ( 17:59:03 OSError, 17:59:03 NewConnectionError, 17:59:03 TimeoutError, 17:59:03 BaseSSLError, 17:59:03 CertificateError, 17:59:03 SSLError, 17:59:03 ) as e: 17:59:03 new_e: Exception = e 17:59:03 if isinstance(e, (BaseSSLError, CertificateError)): 17:59:03 new_e = SSLError(e) 17:59:03 # If the connection didn't successfully connect to it's proxy 17:59:03 # then there 17:59:03 if isinstance( 17:59:03 new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 17:59:03 ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 17:59:03 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 17:59:03 raise new_e 17:59:03 17:59:03 # conn.request() calls http.client.*.request, not the method in 17:59:03 # urllib3.request. It also calls makefile (recv) on the socket. 17:59:03 try: 17:59:03 conn.request( 17:59:03 method, 17:59:03 url, 17:59:03 body=body, 17:59:03 headers=headers, 17:59:03 chunked=chunked, 17:59:03 preload_content=preload_content, 17:59:03 decode_content=decode_content, 17:59:03 enforce_content_length=enforce_content_length, 17:59:03 ) 17:59:03 17:59:03 # We are swallowing BrokenPipeError (errno.EPIPE) since the server is 17:59:03 # legitimately able to close the connection after sending a valid response. 17:59:03 # With this behaviour, the received response is still readable. 
17:59:03 except BrokenPipeError: 17:59:03 pass 17:59:03 except OSError as e: 17:59:03 # MacOS/Linux 17:59:03 # EPROTOTYPE and ECONNRESET are needed on macOS 17:59:03 # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ 17:59:03 # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 17:59:03 if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: 17:59:03 raise 17:59:03 17:59:03 # Reset the timeout for the recv() on the socket 17:59:03 read_timeout = timeout_obj.read_timeout 17:59:03 17:59:03 if not conn.is_closed: 17:59:03 # In Python 3 socket.py will catch EAGAIN and return None when you 17:59:03 # try and read into the file pointer created by http.client, which 17:59:03 # instead raises a BadStatusLine exception. Instead of catching 17:59:03 # the exception and assuming all BadStatusLine exceptions are read 17:59:03 # timeouts, check for a zero timeout before making the request. 17:59:03 if read_timeout == 0: 17:59:03 raise ReadTimeoutError( 17:59:03 self, url, f"Read timed out. 
(read timeout={read_timeout})" 17:59:03 ) 17:59:03 conn.timeout = read_timeout 17:59:03 17:59:03 # Receive the response from the server 17:59:03 try: 17:59:03 > response = conn.getresponse() 17:59:03 ^^^^^^^^^^^^^^^^^^ 17:59:03 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:534: 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse 17:59:03 httplib_response = super().getresponse() 17:59:03 ^^^^^^^^^^^^^^^^^^^^^ 17:59:03 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse 17:59:03 response.begin() 17:59:03 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin 17:59:03 version, status, reason = self._read_status() 17:59:03 ^^^^^^^^^^^^^^^^^^^ 17:59:03 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status 17:59:03 line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1") 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 17:59:03 self = 17:59:03 b = 17:59:03 17:59:03 def readinto(self, b): 17:59:03 """Read up to len(b) bytes into the writable buffer *b* and return 17:59:03 the number of bytes read. If the socket is non-blocking and no bytes 17:59:03 are available, None is returned. 17:59:03 17:59:03 If *b* is non-empty, a 0 return value indicates that the connection 17:59:03 was shutdown at the other end. 
17:59:03 """ 17:59:03 self._checkClosed() 17:59:03 self._checkReadable() 17:59:03 if self._timeout_occurred: 17:59:03 raise OSError("cannot read from timed out object") 17:59:03 while True: 17:59:03 try: 17:59:03 > return self._sock.recv_into(b) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 E TimeoutError: timed out 17:59:03 17:59:03 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError 17:59:03 17:59:03 The above exception was the direct cause of the following exception: 17:59:03 17:59:03 self = 17:59:03 request = , stream = False 17:59:03 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 17:59:03 proxies = OrderedDict() 17:59:03 17:59:03 def send( 17:59:03 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 17:59:03 ): 17:59:03 """Sends PreparedRequest object. Returns Response object. 17:59:03 17:59:03 :param request: The :class:`PreparedRequest ` being sent. 17:59:03 :param stream: (optional) Whether to stream the request content. 17:59:03 :param timeout: (optional) How long to wait for the server to send 17:59:03 data before giving up, as a float, or a :ref:`(connect timeout, 17:59:03 read timeout) ` tuple. 17:59:03 :type timeout: float or tuple or urllib3 Timeout object 17:59:03 :param verify: (optional) Either a boolean, in which case it controls whether 17:59:03 we verify the server's TLS certificate, or a string, in which case it 17:59:03 must be a path to a CA bundle to use 17:59:03 :param cert: (optional) Any user-provided SSL certificate to be trusted. 17:59:03 :param proxies: (optional) The proxies dictionary to apply to the request. 
17:59:03 :rtype: requests.Response 17:59:03 """ 17:59:03 17:59:03 try: 17:59:03 conn = self.get_connection_with_tls_context( 17:59:03 request, verify, proxies=proxies, cert=cert 17:59:03 ) 17:59:03 except LocationValueError as e: 17:59:03 raise InvalidURL(e, request=request) 17:59:03 17:59:03 self.cert_verify(conn, request.url, verify, cert) 17:59:03 url = self.request_url(request, proxies) 17:59:03 self.add_headers( 17:59:03 request, 17:59:03 stream=stream, 17:59:03 timeout=timeout, 17:59:03 verify=verify, 17:59:03 cert=cert, 17:59:03 proxies=proxies, 17:59:03 ) 17:59:03 17:59:03 chunked = not (request.body is None or "Content-Length" in request.headers) 17:59:03 17:59:03 if isinstance(timeout, tuple): 17:59:03 try: 17:59:03 connect, read = timeout 17:59:03 timeout = TimeoutSauce(connect=connect, read=read) 17:59:03 except ValueError: 17:59:03 raise ValueError( 17:59:03 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 17:59:03 f"or a single float to set both timeouts to the same value." 
17:59:03 ) 17:59:03 elif isinstance(timeout, TimeoutSauce): 17:59:03 pass 17:59:03 else: 17:59:03 timeout = TimeoutSauce(connect=timeout, read=timeout) 17:59:03 17:59:03 try: 17:59:03 > resp = conn.urlopen( 17:59:03 method=request.method, 17:59:03 url=url, 17:59:03 body=request.body, 17:59:03 headers=request.headers, 17:59:03 redirect=False, 17:59:03 assert_same_host=False, 17:59:03 preload_content=False, 17:59:03 decode_content=False, 17:59:03 retries=self.max_retries, 17:59:03 timeout=timeout, 17:59:03 chunked=chunked, 17:59:03 ) 17:59:03 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 17:59:03 retries = retries.increment( 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment 17:59:03 raise reraise(type(error), error, _stacktrace) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise 17:59:03 raise value 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen 17:59:03 response = self._make_request( 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request 17:59:03 self._raise_timeout(err=e, url=url, timeout_value=read_timeout) 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 17:59:03 self = 17:59:03 err = TimeoutError('timed out') 17:59:03 url = '/rests/operations/transportpce-device-renderer:service-path' 17:59:03 timeout_value = 30 17:59:03 17:59:03 def _raise_timeout( 17:59:03 self, 17:59:03 err: BaseSSLError | OSError | SocketTimeout, 17:59:03 url: str, 17:59:03 timeout_value: _TYPE_TIMEOUT | None, 17:59:03 ) -> None: 17:59:03 """Is the error actually a timeout? 
Will raise a ReadTimeout or pass""" 17:59:03 17:59:03 if isinstance(err, SocketTimeout): 17:59:03 > raise ReadTimeoutError( 17:59:03 self, url, f"Read timed out. (read timeout={timeout_value})" 17:59:03 ) from err 17:59:03 E urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8191): Read timed out. (read timeout=30) 17:59:03 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError 17:59:03 17:59:03 During handling of the above exception, another exception occurred: 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_05_service_path_create(self): 17:59:03 > response = test_utils.transportpce_api_rpc_request( 17:59:03 'transportpce-device-renderer', 'service-path', 17:59:03 { 17:59:03 'service-name': 'service_test', 17:59:03 'wave-number': '7', 17:59:03 'modulation-format': 'dp-qpsk', 17:59:03 'operation': 'create', 17:59:03 'nodes': [{'node-id': 'ROADMA01', 'src-tp': 'SRG1-PP7-TXRX', 'dest-tp': 'DEG1-TTP-TXRX'}, 17:59:03 {'node-id': 'XPDRA01', 'src-tp': 'XPDR1-CLIENT1', 'dest-tp': 'XPDR1-NETWORK1'}], 17:59:03 'center-freq': 195.8, 17:59:03 'nmc-width': 40, 17:59:03 'min-freq': 195.775, 17:59:03 'max-freq': 195.825, 17:59:03 'lower-spectral-slot-number': 713, 17:59:03 'higher-spectral-slot-number': 720 17:59:03 }) 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:92: 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request 17:59:03 response = post_request(url, data) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 transportpce_tests/common/test_utils.py:143: in post_request 17:59:03 return requests.request( 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 17:59:03 return session.request(method=method, url=url, **kwargs) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 
../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 17:59:03 resp = self.send(prep, **send_kwargs) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 17:59:03 r = adapter.send(request, **kwargs) 17:59:03 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 17:59:03 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 17:59:03 17:59:03 self = 17:59:03 request = , stream = False 17:59:03 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 17:59:03 proxies = OrderedDict() 17:59:03 17:59:03 def send( 17:59:03 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 17:59:03 ): 17:59:03 """Sends PreparedRequest object. Returns Response object. 17:59:03 17:59:03 :param request: The :class:`PreparedRequest ` being sent. 17:59:03 :param stream: (optional) Whether to stream the request content. 17:59:03 :param timeout: (optional) How long to wait for the server to send 17:59:03 data before giving up, as a float, or a :ref:`(connect timeout, 17:59:03 read timeout) ` tuple. 17:59:03 :type timeout: float or tuple or urllib3 Timeout object 17:59:03 :param verify: (optional) Either a boolean, in which case it controls whether 17:59:03 we verify the server's TLS certificate, or a string, in which case it 17:59:03 must be a path to a CA bundle to use 17:59:03 :param cert: (optional) Any user-provided SSL certificate to be trusted. 17:59:03 :param proxies: (optional) The proxies dictionary to apply to the request. 
17:59:03 :rtype: requests.Response 17:59:03 """ 17:59:03 17:59:03 try: 17:59:03 conn = self.get_connection_with_tls_context( 17:59:03 request, verify, proxies=proxies, cert=cert 17:59:03 ) 17:59:03 except LocationValueError as e: 17:59:03 raise InvalidURL(e, request=request) 17:59:03 17:59:03 self.cert_verify(conn, request.url, verify, cert) 17:59:03 url = self.request_url(request, proxies) 17:59:03 self.add_headers( 17:59:03 request, 17:59:03 stream=stream, 17:59:03 timeout=timeout, 17:59:03 verify=verify, 17:59:03 cert=cert, 17:59:03 proxies=proxies, 17:59:03 ) 17:59:03 17:59:03 chunked = not (request.body is None or "Content-Length" in request.headers) 17:59:03 17:59:03 if isinstance(timeout, tuple): 17:59:03 try: 17:59:03 connect, read = timeout 17:59:03 timeout = TimeoutSauce(connect=connect, read=read) 17:59:03 except ValueError: 17:59:03 raise ValueError( 17:59:03 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 17:59:03 f"or a single float to set both timeouts to the same value." 
17:59:03 ) 17:59:03 elif isinstance(timeout, TimeoutSauce): 17:59:03 pass 17:59:03 else: 17:59:03 timeout = TimeoutSauce(connect=timeout, read=timeout) 17:59:03 17:59:03 try: 17:59:03 resp = conn.urlopen( 17:59:03 method=request.method, 17:59:03 url=url, 17:59:03 body=request.body, 17:59:03 headers=request.headers, 17:59:03 redirect=False, 17:59:03 assert_same_host=False, 17:59:03 preload_content=False, 17:59:03 decode_content=False, 17:59:03 retries=self.max_retries, 17:59:03 timeout=timeout, 17:59:03 chunked=chunked, 17:59:03 ) 17:59:03 17:59:03 except (ProtocolError, OSError) as err: 17:59:03 raise ConnectionError(err, request=request) 17:59:03 17:59:03 except MaxRetryError as e: 17:59:03 if isinstance(e.reason, ConnectTimeoutError): 17:59:03 # TODO: Remove this in 3.0.0: see #2811 17:59:03 if not isinstance(e.reason, NewConnectionError): 17:59:03 raise ConnectTimeout(e, request=request) 17:59:03 17:59:03 if isinstance(e.reason, ResponseError): 17:59:03 raise RetryError(e, request=request) 17:59:03 17:59:03 if isinstance(e.reason, _ProxyError): 17:59:03 raise ProxyError(e, request=request) 17:59:03 17:59:03 if isinstance(e.reason, _SSLError): 17:59:03 # This branch is for urllib3 v1.22 and later. 17:59:03 raise SSLError(e, request=request) 17:59:03 17:59:03 raise ConnectionError(e, request=request) 17:59:03 17:59:03 except ClosedPoolError as e: 17:59:03 raise ConnectionError(e, request=request) 17:59:03 17:59:03 except _ProxyError as e: 17:59:03 raise ProxyError(e) 17:59:03 17:59:03 except (_SSLError, _HTTPError) as e: 17:59:03 if isinstance(e, _SSLError): 17:59:03 # This branch is for urllib3 versions earlier than v1.22 17:59:03 raise SSLError(e, request=request) 17:59:03 elif isinstance(e, ReadTimeoutError): 17:59:03 > raise ReadTimeout(e, request=request) 17:59:03 E requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8191): Read timed out. 
(read timeout=30) 17:59:03 17:59:03 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_05_service_path_create 17:59:03 ________ TestTransportPCERenderer.test_06_service_path_create_rdm_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_06_service_path_create_rdm_check(self): 17:59:03 response = test_utils.check_node_attribute_request("ROADMA01", "interface", "DEG1-TTP-TXRX-713:720") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:113: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_06_service_path_create_rdm_check 17:59:03 ________ TestTransportPCERenderer.test_07_service_path_create_rdm_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_07_service_path_create_rdm_check(self): 17:59:03 response = test_utils.check_node_attribute_request("ROADMA01", "interface", "SRG1-PP7-TXRX-713:720") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:129: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_07_service_path_create_rdm_check 17:59:03 ________ TestTransportPCERenderer.test_08_service_path_create_rdm_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_08_service_path_create_rdm_check(self): 17:59:03 response = test_utils.check_node_attribute_request( 17:59:03 "ROADMA01", "roadm-connections", "SRG1-PP7-TXRX-DEG1-TTP-TXRX-713:720") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 
200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:146: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_08_service_path_create_rdm_check 17:59:03 _______ TestTransportPCERenderer.test_09_service_path_create_xpdr_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_09_service_path_create_xpdr_check(self): 17:59:03 response = test_utils.check_node_attribute_request("XPDRA01", "interface", "XPDR1-NETWORK1-713:720") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:159: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_09_service_path_create_xpdr_check 17:59:03 _______ TestTransportPCERenderer.test_10_service_path_create_xpdr_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_10_service_path_create_xpdr_check(self): 17:59:03 response = test_utils.check_node_attribute_request("XPDRA01", "interface", "XPDR1-NETWORK1-OTU") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:177: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_10_service_path_create_xpdr_check 17:59:03 _______ TestTransportPCERenderer.test_11_service_path_create_xpdr_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_11_service_path_create_xpdr_check(self): 17:59:03 response = test_utils.check_node_attribute_request("XPDRA01", "interface", "XPDR1-NETWORK1-ODU") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 
transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:194: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_11_service_path_create_xpdr_check 17:59:03 _______ TestTransportPCERenderer.test_12_service_path_create_xpdr_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_12_service_path_create_xpdr_check(self): 17:59:03 response = test_utils.check_node_attribute_request("XPDRA01", "interface", "XPDR1-CLIENT1-ETHERNET") 17:59:03 > self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 E AssertionError: 409 != 200 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:217: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_12_service_path_create_xpdr_check 17:59:03 _______ TestTransportPCERenderer.test_13_service_path_create_xpdr_check ________ 17:59:03 17:59:03 self = 17:59:03 17:59:03 def test_13_service_path_create_xpdr_check(self): 17:59:03 response = test_utils.check_node_attribute_request("XPDRA01", "circuit-packs", "1%2F0%2F1-PLUG-NET") 17:59:03 # FIXME: https://jira.opendaylight.org/browse/TRNSPRTPCE-591 17:59:03 self.assertEqual(response['status_code'], requests.codes.ok) 17:59:03 > self.assertIn('not-reserved-inuse', response['circuit-packs'][0]["equipment-state"]) 17:59:03 E AssertionError: 'not-reserved-inuse' not found in 'not-reserved-available' 17:59:03 17:59:03 transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py:239: AssertionError 17:59:03 ----------------------------- Captured stdout call ----------------------------- 17:59:03 execution of test_13_service_path_create_xpdr_check 17:59:03 =========================== short test summary info ============================ 17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_05_service_path_create 17:59:03 
FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_06_service_path_create_rdm_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_07_service_path_create_rdm_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_08_service_path_create_rdm_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_09_service_path_create_xpdr_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_10_service_path_create_xpdr_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_11_service_path_create_xpdr_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_12_service_path_create_xpdr_check
17:59:03 FAILED transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py::TestTransportPCERenderer::test_13_service_path_create_xpdr_check
17:59:03 9 failed, 15 passed in 118.22s (0:01:58)
17:59:03 tests121: exit 1 (459.13 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 pid=7350
17:59:15 FFFFF. [100%]
17:59:33 51 passed in 489.25s (0:08:09)
17:59:33 pytest -q transportpce_tests/tapi/test02_full_topology.py
17:59:56 FFF...........FF.FF.F..........FFF..............

[100%] 18:02:00 =================================== FAILURES =================================== 18:02:00 ________ TestTransportPCEOtnRenderer.test_02_service_path_create_otuc2 _________ 18:02:00 18:02:00 self = 18:02:00 conn = 18:02:00 method = 'POST' 18:02:00 url = '/rests/operations/transportpce-device-renderer:service-path' 18:02:00 body = '{"input": {"service-name": "service_OTUC2", "wave-number": "0", "modulation-format": "dp-qpsk", "operation": "create"...75, "min-freq": 196.0375, "max-freq": 196.125, "lower-spectral-slot-number": 755, "higher-spectral-slot-number": 768}}' 18:02:00 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '336', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 18:02:00 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 18:02:00 timeout = Timeout(connect=30, read=30, total=None), chunked = False 18:02:00 response_conn = 18:02:00 preload_content = False, decode_content = False, enforce_content_length = True 18:02:00 18:02:00 def _make_request( 18:02:00 self, 18:02:00 conn: BaseHTTPConnection, 18:02:00 method: str, 18:02:00 url: str, 18:02:00 body: _TYPE_BODY | None = None, 18:02:00 headers: typing.Mapping[str, str] | None = None, 18:02:00 retries: Retry | None = None, 18:02:00 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 18:02:00 chunked: bool = False, 18:02:00 response_conn: BaseHTTPConnection | None = None, 18:02:00 preload_content: bool = True, 18:02:00 decode_content: bool = True, 18:02:00 enforce_content_length: bool = True, 18:02:00 ) -> BaseHTTPResponse: 18:02:00 """ 18:02:00 Perform a request on a given urllib connection object taken from our 18:02:00 pool. 18:02:00 18:02:00 :param conn: 18:02:00 a connection from one of our connection pools 18:02:00 18:02:00 :param method: 18:02:00 HTTP request method (such as GET, POST, PUT, etc.) 
18:02:00 18:02:00 :param url: 18:02:00 The URL to perform the request on. 18:02:00 18:02:00 :param body: 18:02:00 Data to send in the request body, either :class:`str`, :class:`bytes`, 18:02:00 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 18:02:00 18:02:00 :param headers: 18:02:00 Dictionary of custom headers to send, such as User-Agent, 18:02:00 If-None-Match, etc. If None, pool headers are used. If provided, 18:02:00 these headers completely replace any pool-specific headers. 18:02:00 18:02:00 :param retries: 18:02:00 Configure the number of retries to allow before raising a 18:02:00 :class:`~urllib3.exceptions.MaxRetryError` exception. 18:02:00 18:02:00 Pass ``None`` to retry until you receive a response. Pass a 18:02:00 :class:`~urllib3.util.retry.Retry` object for fine-grained control 18:02:00 over different types of retries. 18:02:00 Pass an integer number to retry connection errors that many times, 18:02:00 but no other types of errors. Pass zero to never retry. 18:02:00 18:02:00 If ``False``, then retries are disabled and any exception is raised 18:02:00 immediately. Also, instead of raising a MaxRetryError on redirects, 18:02:00 the redirect response will be returned. 18:02:00 18:02:00 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 18:02:00 18:02:00 :param timeout: 18:02:00 If specified, overrides the default timeout for this one 18:02:00 request. It may be a float (in seconds) or an instance of 18:02:00 :class:`urllib3.util.Timeout`. 18:02:00 18:02:00 :param chunked: 18:02:00 If True, urllib3 will send the body using chunked transfer 18:02:00 encoding. Otherwise, urllib3 will send the body using the standard 18:02:00 content-length form. Defaults to False. 18:02:00 18:02:00 :param response_conn: 18:02:00 Set this to ``None`` if you will handle releasing the connection or 18:02:00 set the connection to have the response release it. 
18:02:00 18:02:00 :param preload_content: 18:02:00 If True, the response's body will be preloaded during construction. 18:02:00 18:02:00 :param decode_content: 18:02:00 If True, will attempt to decode the body based on the 18:02:00 'content-encoding' header. 18:02:00 18:02:00 :param enforce_content_length: 18:02:00 Enforce content length checking. Body returned by server must match 18:02:00 value of Content-Length header, if present. Otherwise, raise error. 18:02:00 """ 18:02:00 self.num_requests += 1 18:02:00 18:02:00 timeout_obj = self._get_timeout(timeout) 18:02:00 timeout_obj.start_connect() 18:02:00 conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 18:02:00 18:02:00 try: 18:02:00 # Trigger any extra validation we need to do. 18:02:00 try: 18:02:00 self._validate_conn(conn) 18:02:00 except (SocketTimeout, BaseSSLError) as e: 18:02:00 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 18:02:00 raise 18:02:00 18:02:00 # _validate_conn() starts the connection to an HTTPS proxy 18:02:00 # so we need to wrap errors with 'ProxyError' here too. 18:02:00 except ( 18:02:00 OSError, 18:02:00 NewConnectionError, 18:02:00 TimeoutError, 18:02:00 BaseSSLError, 18:02:00 CertificateError, 18:02:00 SSLError, 18:02:00 ) as e: 18:02:00 new_e: Exception = e 18:02:00 if isinstance(e, (BaseSSLError, CertificateError)): 18:02:00 new_e = SSLError(e) 18:02:00 # If the connection didn't successfully connect to it's proxy 18:02:00 # then there 18:02:00 if isinstance( 18:02:00 new_e, (OSError, NewConnectionError, TimeoutError, SSLError) 18:02:00 ) and (conn and conn.proxy and not conn.has_connected_to_proxy): 18:02:00 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) 18:02:00 raise new_e 18:02:00 18:02:00 # conn.request() calls http.client.*.request, not the method in 18:02:00 # urllib3.request. It also calls makefile (recv) on the socket. 
18:02:00             try:
18:02:00                 conn.request(
18:02:00                     method,
18:02:00                     url,
18:02:00                     body=body,
18:02:00                     headers=headers,
18:02:00                     chunked=chunked,
18:02:00                     preload_content=preload_content,
18:02:00                     decode_content=decode_content,
18:02:00                     enforce_content_length=enforce_content_length,
18:02:00                 )
18:02:00
18:02:00             # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00             # legitimately able to close the connection after sending a valid response.
18:02:00             # With this behaviour, the received response is still readable.
18:02:00             except BrokenPipeError:
18:02:00                 pass
18:02:00             except OSError as e:
18:02:00                 # MacOS/Linux
18:02:00                 # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00                 # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00                 # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00                 if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                     raise
18:02:00
18:02:00             # Reset the timeout for the recv() on the socket
18:02:00             read_timeout = timeout_obj.read_timeout
18:02:00
18:02:00             if not conn.is_closed:
18:02:00                 # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00                 # try and read into the file pointer created by http.client, which
18:02:00                 # instead raises a BadStatusLine exception. Instead of catching
18:02:00                 # the exception and assuming all BadStatusLine exceptions are read
18:02:00                 # timeouts, check for a zero timeout before making the request.
18:02:00                 if read_timeout == 0:
18:02:00                     raise ReadTimeoutError(
18:02:00                         self, url, f"Read timed out. (read timeout={read_timeout})"
18:02:00                     )
18:02:00                 conn.timeout = read_timeout
18:02:00
18:02:00             # Receive the response from the server
18:02:00             try:
18:02:00 >               response = conn.getresponse()
18:02:00                            ^^^^^^^^^^^^^^^^^^
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 b =
18:02:00
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read.  If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00
18:02:00 The above exception was the direct cause of the following exception:
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:service-path'
18:02:00 timeout_value = 30
18:02:00
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout? Will raise a ReadTimeout or pass"""
18:02:00
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00
18:02:00 During handling of the above exception, another exception occurred:
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_02_service_path_create_otuc2(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'service-path',
18:02:00             {
18:02:00                 'service-name': 'service_OTUC2',
18:02:00                 'wave-number': '0',
18:02:00                 'modulation-format': 'dp-qpsk',
18:02:00                 'operation': 'create',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'dest-tp': 'XPDR2-NETWORK1'}],
18:02:00                 'center-freq': 196.1,
18:02:00                 'nmc-width': 75,
18:02:00                 'min-freq': 196.0375,
18:02:00                 'max-freq': 196.125,
18:02:00                 'lower-spectral-slot-number': 755,
18:02:00                 'higher-spectral-slot-number': 768
18:02:00             })
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:74:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00             resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00         except (ProtocolError, OSError) as err:
18:02:00             raise ConnectionError(err, request=request)
18:02:00
18:02:00         except MaxRetryError as e:
18:02:00             if isinstance(e.reason, ConnectTimeoutError):
18:02:00                 # TODO: Remove this in 3.0.0: see #2811
18:02:00                 if not isinstance(e.reason, NewConnectionError):
18:02:00                     raise ConnectTimeout(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, ResponseError):
18:02:00                 raise RetryError(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, _ProxyError):
18:02:00                 raise ProxyError(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, _SSLError):
18:02:00                 # This branch is for urllib3 v1.22 and later.
18:02:00                 raise SSLError(e, request=request)
18:02:00
18:02:00             raise ConnectionError(e, request=request)
18:02:00
18:02:00         except ClosedPoolError as e:
18:02:00             raise ConnectionError(e, request=request)
18:02:00
18:02:00         except _ProxyError as e:
18:02:00             raise ProxyError(e)
18:02:00
18:02:00         except (_SSLError, _HTTPError) as e:
18:02:00             if isinstance(e, _SSLError):
18:02:00                 # This branch is for urllib3 versions earlier than v1.22
18:02:00                 raise SSLError(e, request=request)
18:02:00             elif isinstance(e, ReadTimeoutError):
18:02:00 >               raise ReadTimeout(e, request=request)
18:02:00 E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_02_service_path_create_otuc2
18:02:00 _________ TestTransportPCEOtnRenderer.test_03_get_portmapping_network1 _________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_03_get_portmapping_network1(self):
18:02:00         response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
18:02:00         self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00         self.NETWORK2_CHECK_DICT["supporting-otucn"] = "XPDR2-NETWORK1-OTUC2"
18:02:00         expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
18:02:00         response_sorted = [
18:02:00             test_utils.recursive_sort(item) for item in response['mapping']
18:02:00         ]
18:02:00 >       self.assertIn(expected_sorted, response_sorted)
18:02:00 E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC2', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:105: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_03_get_portmapping_network1
18:02:00 ___________ TestTransportPCEOtnRenderer.test_04_check_interface_otsi ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_04_check_interface_otsi(self):
18:02:00         # pylint: disable=line-too-long
18:02:00         response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-755:768")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:110: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_04_check_interface_otsi
18:02:00 __________ TestTransportPCEOtnRenderer.test_05_check_interface_otsig ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_05_check_interface_otsig(self):
18:02:00         response = test_utils.check_node_attribute_request(
18:02:00             "XPDR-A2", "interface", "XPDR2-NETWORK1-OTSIGROUP-200G")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:135: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_05_check_interface_otsig
18:02:00 __________ TestTransportPCEOtnRenderer.test_06_check_interface_otuc2 ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_06_check_interface_otuc2(self):
18:02:00         response = test_utils.check_node_attribute_request(
18:02:00             "XPDR-A2", "interface", "XPDR2-NETWORK1-OTUC2")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:154: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_06_check_interface_otuc2
18:02:00 ______ TestTransportPCEOtnRenderer.test_07_otn_service_path_create_oduc2 _______
18:02:00
18:02:00 self =
18:02:00 conn =
18:02:00 method = 'POST'
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 body = '{"input": {"service-name": "service_ODUC2", "operation": "create", "service-rate": "200", "service-format": "ODU", "nodes": [{"node-id": "XPDR-A2", "network-tp": "XPDR2-NETWORK1"}]}}'
18:02:00 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '182', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
18:02:00 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
18:02:00 timeout = Timeout(connect=30, read=30, total=None), chunked = False
18:02:00 response_conn =
18:02:00 preload_content = False, decode_content = False, enforce_content_length = True
18:02:00
18:02:00     def _make_request(
18:02:00         self,
18:02:00         conn: BaseHTTPConnection,
18:02:00         method: str,
18:02:00         url: str,
18:02:00         body: _TYPE_BODY | None = None,
18:02:00         headers: typing.Mapping[str, str] | None = None,
18:02:00         retries: Retry | None = None,
18:02:00         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
18:02:00         chunked: bool = False,
18:02:00         response_conn: BaseHTTPConnection | None = None,
18:02:00         preload_content: bool = True,
18:02:00         decode_content: bool = True,
18:02:00         enforce_content_length: bool = True,
18:02:00     ) -> BaseHTTPResponse:
18:02:00         """
18:02:00         Perform a request on a given urllib connection object taken from our
18:02:00         pool.
18:02:00
18:02:00         :param conn:
18:02:00             a connection from one of our connection pools
18:02:00
18:02:00         :param method:
18:02:00             HTTP request method (such as GET, POST, PUT, etc.)
18:02:00
18:02:00         :param url:
18:02:00             The URL to perform the request on.
18:02:00
18:02:00         :param body:
18:02:00             Data to send in the request body, either :class:`str`, :class:`bytes`,
18:02:00             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
18:02:00
18:02:00         :param headers:
18:02:00             Dictionary of custom headers to send, such as User-Agent,
18:02:00             If-None-Match, etc. If None, pool headers are used. If provided,
18:02:00             these headers completely replace any pool-specific headers.
18:02:00
18:02:00         :param retries:
18:02:00             Configure the number of retries to allow before raising a
18:02:00             :class:`~urllib3.exceptions.MaxRetryError` exception.
18:02:00
18:02:00             Pass ``None`` to retry until you receive a response. Pass a
18:02:00             :class:`~urllib3.util.retry.Retry` object for fine-grained control
18:02:00             over different types of retries.
18:02:00             Pass an integer number to retry connection errors that many times,
18:02:00             but no other types of errors. Pass zero to never retry.
18:02:00
18:02:00             If ``False``, then retries are disabled and any exception is raised
18:02:00             immediately. Also, instead of raising a MaxRetryError on redirects,
18:02:00             the redirect response will be returned.
18:02:00
18:02:00         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
18:02:00
18:02:00         :param timeout:
18:02:00             If specified, overrides the default timeout for this one
18:02:00             request. It may be a float (in seconds) or an instance of
18:02:00             :class:`urllib3.util.Timeout`.
18:02:00
18:02:00         :param chunked:
18:02:00             If True, urllib3 will send the body using chunked transfer
18:02:00             encoding. Otherwise, urllib3 will send the body using the standard
18:02:00             content-length form. Defaults to False.
18:02:00
18:02:00         :param response_conn:
18:02:00             Set this to ``None`` if you will handle releasing the connection or
18:02:00             set the connection to have the response release it.
18:02:00
18:02:00         :param preload_content:
18:02:00             If True, the response's body will be preloaded during construction.
18:02:00
18:02:00         :param decode_content:
18:02:00             If True, will attempt to decode the body based on the
18:02:00             'content-encoding' header.
18:02:00
18:02:00         :param enforce_content_length:
18:02:00             Enforce content length checking. Body returned by server must match
18:02:00             value of Content-Length header, if present. Otherwise, raise error.
18:02:00         """
18:02:00         self.num_requests += 1
18:02:00
18:02:00         timeout_obj = self._get_timeout(timeout)
18:02:00         timeout_obj.start_connect()
18:02:00         conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
18:02:00
18:02:00         try:
18:02:00             # Trigger any extra validation we need to do.
18:02:00             try:
18:02:00                 self._validate_conn(conn)
18:02:00             except (SocketTimeout, BaseSSLError) as e:
18:02:00                 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
18:02:00                 raise
18:02:00
18:02:00         # _validate_conn() starts the connection to an HTTPS proxy
18:02:00         # so we need to wrap errors with 'ProxyError' here too.
18:02:00         except (
18:02:00             OSError,
18:02:00             NewConnectionError,
18:02:00             TimeoutError,
18:02:00             BaseSSLError,
18:02:00             CertificateError,
18:02:00             SSLError,
18:02:00         ) as e:
18:02:00             new_e: Exception = e
18:02:00             if isinstance(e, (BaseSSLError, CertificateError)):
18:02:00                 new_e = SSLError(e)
18:02:00             # If the connection didn't successfully connect to it's proxy
18:02:00             # then there
18:02:00             if isinstance(
18:02:00                 new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
18:02:00             ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
18:02:00                 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
18:02:00             raise new_e
18:02:00
18:02:00         # conn.request() calls http.client.*.request, not the method in
18:02:00         # urllib3.request. It also calls makefile (recv) on the socket.
18:02:00         try:
18:02:00             conn.request(
18:02:00                 method,
18:02:00                 url,
18:02:00                 body=body,
18:02:00                 headers=headers,
18:02:00                 chunked=chunked,
18:02:00                 preload_content=preload_content,
18:02:00                 decode_content=decode_content,
18:02:00                 enforce_content_length=enforce_content_length,
18:02:00             )
18:02:00
18:02:00         # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00         # legitimately able to close the connection after sending a valid response.
18:02:00         # With this behaviour, the received response is still readable.
18:02:00         except BrokenPipeError:
18:02:00             pass
18:02:00         except OSError as e:
18:02:00             # MacOS/Linux
18:02:00             # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00             # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00             # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00             if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                 raise
18:02:00
18:02:00         # Reset the timeout for the recv() on the socket
18:02:00         read_timeout = timeout_obj.read_timeout
18:02:00
18:02:00         if not conn.is_closed:
18:02:00             # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00             # try and read into the file pointer created by http.client, which
18:02:00             # instead raises a BadStatusLine exception. Instead of catching
18:02:00             # the exception and assuming all BadStatusLine exceptions are read
18:02:00             # timeouts, check for a zero timeout before making the request.
18:02:00             if read_timeout == 0:
18:02:00                 raise ReadTimeoutError(
18:02:00                     self, url, f"Read timed out. (read timeout={read_timeout})"
18:02:00                 )
18:02:00             conn.timeout = read_timeout
18:02:00
18:02:00         # Receive the response from the server
18:02:00         try:
18:02:00 >           response = conn.getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 b =
18:02:00
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read.  If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00
18:02:00 The above exception was the direct cause of the following exception:
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 timeout_value = 30
18:02:00
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout? Will raise a ReadTimeout or pass"""
18:02:00
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00
18:02:00 During handling of the above exception, another exception occurred:
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_07_otn_service_path_create_oduc2(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'otn-service-path',
18:02:00             {
18:02:00                 'service-name': 'service_ODUC2',
18:02:00                 'operation': 'create',
18:02:00                 'service-rate': '200',
18:02:00                 'service-format': 'ODU',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'network-tp': 'XPDR2-NETWORK1'}]
18:02:00             })
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:175:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00 ) 18:02:00 elif isinstance(timeout, TimeoutSauce): 18:02:00 pass 18:02:00 else: 18:02:00 timeout = TimeoutSauce(connect=timeout, read=timeout) 18:02:00 18:02:00 try: 18:02:00 resp = conn.urlopen( 18:02:00 method=request.method, 18:02:00 url=url, 18:02:00 body=request.body, 18:02:00 headers=request.headers, 18:02:00 redirect=False, 18:02:00 assert_same_host=False, 18:02:00 preload_content=False, 18:02:00 decode_content=False, 18:02:00 retries=self.max_retries, 18:02:00 timeout=timeout, 18:02:00 chunked=chunked, 18:02:00 ) 18:02:00 18:02:00 except (ProtocolError, OSError) as err: 18:02:00 raise ConnectionError(err, request=request) 18:02:00 18:02:00 except MaxRetryError as e: 18:02:00 if isinstance(e.reason, ConnectTimeoutError): 18:02:00 # TODO: Remove this in 3.0.0: see #2811 18:02:00 if not isinstance(e.reason, NewConnectionError): 18:02:00 raise ConnectTimeout(e, request=request) 18:02:00 18:02:00 if isinstance(e.reason, ResponseError): 18:02:00 raise RetryError(e, request=request) 18:02:00 18:02:00 if isinstance(e.reason, _ProxyError): 18:02:00 raise ProxyError(e, request=request) 18:02:00 18:02:00 if isinstance(e.reason, _SSLError): 18:02:00 # This branch is for urllib3 v1.22 and later. 18:02:00 raise SSLError(e, request=request) 18:02:00 18:02:00 raise ConnectionError(e, request=request) 18:02:00 18:02:00 except ClosedPoolError as e: 18:02:00 raise ConnectionError(e, request=request) 18:02:00 18:02:00 except _ProxyError as e: 18:02:00 raise ProxyError(e) 18:02:00 18:02:00 except (_SSLError, _HTTPError) as e: 18:02:00 if isinstance(e, _SSLError): 18:02:00 # This branch is for urllib3 versions earlier than v1.22 18:02:00 raise SSLError(e, request=request) 18:02:00 elif isinstance(e, ReadTimeoutError): 18:02:00 > raise ReadTimeout(e, request=request) 18:02:00 E requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out. 
(read timeout=30)

../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
----------------------------- Captured stdout call -----------------------------
execution of test_07_otn_service_path_create_oduc2
_________ TestTransportPCEOtnRenderer.test_08_get_portmapping_network1 _________

self =

    def test_08_get_portmapping_network1(self):
        response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
        self.assertEqual(response['status_code'], requests.codes.ok)
        self.NETWORK2_CHECK_DICT["supporting-oducn"] = "XPDR2-NETWORK1-ODUC2"
        expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
        response_sorted = [
            test_utils.recursive_sort(item) for item in response['mapping']
        ]
>       self.assertIn(expected_sorted, response_sorted)
E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-oducn': 'XPDR2-NETWORK1-ODUC2', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC2', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]

transportpce_tests/7.1/test02_otn_renderer.py:198: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_08_get_portmapping_network1
__________ TestTransportPCEOtnRenderer.test_09_check_interface_oduc2 ___________

self =

    def test_09_check_interface_oduc2(self):
        response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-ODUC2")
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 409 != 200

transportpce_tests/7.1/test02_otn_renderer.py:202: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_09_check_interface_oduc2
______ TestTransportPCEOtnRenderer.test_10_otn_service_path_create_100ge _______

self =
conn =
method = 'POST'
url = '/rests/operations/transportpce-device-renderer:otn-service-path'
body = '{"input": {"service-name": "service_Ethernet", "operation": "create", "service-rate": "100", "service-format": "Ether...2-CLIENT1", "network-tp": "XPDR2-NETWORK1"}], "ethernet-encoding": "eth encode", "opucn-trib-slots": ["1.1", "1.20"]}}'
headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '292', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=30, read=30, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn:
            BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our
        pool.

        :param conn:
            a connection from one of our connection pools

        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)

        :param url:
            The URL to perform the request on.

        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.

        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.

        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.

            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.

            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.

        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.

        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.

        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.

        :param response_conn:
            Set this to ``None`` if you will handle releasing the connection or
            set the connection to have the response release it.

        :param preload_content:
            If True, the response's body will be preloaded during construction.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.

        :param enforce_content_length:
            Enforce content length checking. Body returned by server must match
            value of Content-Length header, if present. Otherwise, raise error.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)

        try:
            # Trigger any extra validation we need to do.
            try:
                self._validate_conn(conn)
            except (SocketTimeout, BaseSSLError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
                raise

        # _validate_conn() starts the connection to an HTTPS proxy
        # so we need to wrap errors with 'ProxyError' here too.
        except (
            OSError,
            NewConnectionError,
            TimeoutError,
            BaseSSLError,
            CertificateError,
            SSLError,
        ) as e:
            new_e: Exception = e
            if isinstance(e, (BaseSSLError, CertificateError)):
                new_e = SSLError(e)
            # If the connection didn't successfully connect to it's proxy
            # then there
            if isinstance(
                new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
            ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
                new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
            raise new_e

        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            conn.request(
                method,
                url,
                body=body,
                headers=headers,
                chunked=chunked,
                preload_content=preload_content,
                decode_content=decode_content,
                enforce_content_length=enforce_content_length,
            )

        # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
        # legitimately able to close the connection after sending a valid response.
        # With this behaviour, the received response is still readable.
        except BrokenPipeError:
            pass
        except OSError as e:
            # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
>           response = conn.getresponse()
            ^^^^^^^^^^^^^^^^^^

../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
    httplib_response = super().getresponse()
    ^^^^^^^^^^^^^^^^^^^^^
/opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
    response.begin()
/opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
    version, status, reason = self._read_status()
    ^^^^^^^^^^^^^^^^^^^
/opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
b =

    def readinto(self, b):
        """Read up to len(b) bytes into the writable
        buffer *b* and return
        the number of bytes read. If the socket is non-blocking and no bytes
        are available, None is returned.

        If *b* is non-empty, a 0 return value indicates that the connection
        was shutdown at the other end.
        """
        self._checkClosed()
        self._checkReadable()
        if self._timeout_occurred:
            raise OSError("cannot read from timed out object")
        while True:
            try:
>               return self._sock.recv_into(b)
                ^^^^^^^^^^^^^^^^^^^^^^^
E               TimeoutError: timed out

/opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError

The above exception was the direct cause of the following exception:

self =
request = , stream = False
timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
proxies = OrderedDict()

    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest ` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) ` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """

        try:
            conn = self.get_connection_with_tls_context(
                request, verify, proxies=proxies, cert=cert
            )
        except LocationValueError as e:
            raise InvalidURL(e, request=request)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(
            request,
            stream=stream,
            timeout=timeout,
            verify=verify,
            cert=cert,
            proxies=proxies,
        )

        chunked = not (request.body is None or "Content-Length" in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError:
                raise ValueError(
                    f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                    f"or a single float to set both timeouts to the same value."
                )
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)

        try:
>           resp = conn.urlopen(
                method=request.method,
                url=url,
                body=request.body,
                headers=request.headers,
                redirect=False,
                assert_same_host=False,
                preload_content=False,
                decode_content=False,
                retries=self.max_retries,
                timeout=timeout,
                chunked=chunked,
            )

../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
    retries = retries.increment(
../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
    raise reraise(type(error), error, _stacktrace)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
    raise value
../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
    self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
err = TimeoutError('timed out')
url = '/rests/operations/transportpce-device-renderer:otn-service-path'
timeout_value = 30

    def _raise_timeout(
        self,
        err: BaseSSLError | OSError | SocketTimeout,
        url: str,
        timeout_value: _TYPE_TIMEOUT | None,
    ) -> None:
        """Is the error actually a timeout? Will raise a ReadTimeout or pass"""

        if isinstance(err, SocketTimeout):
>           raise ReadTimeoutError(
                self, url, f"Read timed out. (read timeout={timeout_value})"
            ) from err
E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)

../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError

During handling of the above exception, another exception occurred:

self =

    def test_10_otn_service_path_create_100ge(self):
>       response = test_utils.transportpce_api_rpc_request(
            'transportpce-device-renderer', 'otn-service-path',
            {
                'service-name': 'service_Ethernet',
                'operation': 'create',
                'service-rate': '100',
                'service-format': 'Ethernet',
                'nodes': [{'node-id': 'XPDR-A2', 'client-tp': 'XPDR2-CLIENT1', 'network-tp': 'XPDR2-NETWORK1'}],
                'ethernet-encoding': 'eth encode',
                'opucn-trib-slots': ['1.1', '1.20']
            })

transportpce_tests/7.1/test02_otn_renderer.py:232:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
    response = post_request(url, data)
    ^^^^^^^^^^^^^^^^^^^^^^^
transportpce_tests/common/test_utils.py:143: in post_request
    return requests.request(
../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
request = , stream = False
timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
proxies = OrderedDict()

    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest ` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) ` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """

        try:
            conn = self.get_connection_with_tls_context(
                request, verify, proxies=proxies, cert=cert
            )
        except LocationValueError as e:
            raise InvalidURL(e, request=request)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(
            request,
            stream=stream,
            timeout=timeout,
            verify=verify,
            cert=cert,
            proxies=proxies,
        )

        chunked = not (request.body is None or "Content-Length" in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError:
                raise ValueError(
                    f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                    f"or a single float to set both timeouts to the same value."
                )
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)

        try:
            resp = conn.urlopen(
                method=request.method,
                url=url,
                body=request.body,
                headers=request.headers,
                redirect=False,
                assert_same_host=False,
                preload_content=False,
                decode_content=False,
                retries=self.max_retries,
                timeout=timeout,
                chunked=chunked,
            )

        except (ProtocolError, OSError) as err:
            raise ConnectionError(err, request=request)

        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)

            raise ConnectionError(e, request=request)

        except ClosedPoolError as e:
            raise ConnectionError(e, request=request)

        except _ProxyError as e:
            raise ProxyError(e)

        except (_SSLError, _HTTPError) as e:
            if isinstance(e, _SSLError):
                # This branch is for urllib3 versions earlier than v1.22
                raise SSLError(e, request=request)
            elif isinstance(e, ReadTimeoutError):
>               raise ReadTimeout(e, request=request)
E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out.
(read timeout=30)

../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
----------------------------- Captured stdout call -----------------------------
execution of test_10_otn_service_path_create_100ge
_______ TestTransportPCEOtnRenderer.test_11_check_interface_100ge_client _______

self =

    def test_11_check_interface_100ge_client(self):
        response = test_utils.check_node_attribute_request(
            "XPDR-A2", "interface", "XPDR2-CLIENT1-ETHERNET-100G")
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 409 != 200

transportpce_tests/7.1/test02_otn_renderer.py:257: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_11_check_interface_100ge_client
_______ TestTransportPCEOtnRenderer.test_12_check_interface_odu4_client ________

self =

    def test_12_check_interface_odu4_client(self):
        response = test_utils.check_node_attribute_request(
            "XPDR-A2", "interface", "XPDR2-CLIENT1-ODU4:service_Ethernet")
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 409 != 200

transportpce_tests/7.1/test02_otn_renderer.py:274: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_12_check_interface_odu4_client
_______ TestTransportPCEOtnRenderer.test_13_check_interface_odu4_network _______

self =

    def test_13_check_interface_odu4_network(self):
        response = test_utils.check_node_attribute_request(
            "XPDR-A2", "interface", "XPDR2-NETWORK1-ODU4:service_Ethernet")
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 409 != 200

transportpce_tests/7.1/test02_otn_renderer.py:298: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_13_check_interface_odu4_network
_______ TestTransportPCEOtnRenderer.test_14_check_odu_connection_xpdra2 ________

self =

    def test_14_check_odu_connection_xpdra2(self):
        response = test_utils.check_node_attribute_request(
            "XPDR-A2",
            "odu-connection", "XPDR2-CLIENT1-ODU4-x-XPDR2-NETWORK1-ODU4")
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 409 != 200

transportpce_tests/7.1/test02_otn_renderer.py:329: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_14_check_odu_connection_xpdra2
________ TestTransportPCEOtnRenderer.test_26_service_path_create_otuc3 _________

self =
conn =
method = 'POST'
url = '/rests/operations/transportpce-device-renderer:service-path'
body = '{"input": {"service-name": "service_OTUC3", "wave-number": "0", "modulation-format": "dp-qam8", "operation": "create"...75, "min-freq": 196.0375, "max-freq": 196.125, "lower-spectral-slot-number": 755, "higher-spectral-slot-number": 768}}'
headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '336', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=30, read=30, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn:
BaseHTTPConnection, 18:02:00 method: str, 18:02:00 url: str, 18:02:00 body: _TYPE_BODY | None = None, 18:02:00 headers: typing.Mapping[str, str] | None = None, 18:02:00 retries: Retry | None = None, 18:02:00 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 18:02:00 chunked: bool = False, 18:02:00 response_conn: BaseHTTPConnection | None = None, 18:02:00 preload_content: bool = True, 18:02:00 decode_content: bool = True, 18:02:00 enforce_content_length: bool = True, 18:02:00 ) -> BaseHTTPResponse: 18:02:00 """ 18:02:00 Perform a request on a given urllib connection object taken from our 18:02:00 pool. 18:02:00 18:02:00 :param conn: 18:02:00 a connection from one of our connection pools 18:02:00 18:02:00 :param method: 18:02:00 HTTP request method (such as GET, POST, PUT, etc.) 18:02:00 18:02:00 :param url: 18:02:00 The URL to perform the request on. 18:02:00 18:02:00 :param body: 18:02:00 Data to send in the request body, either :class:`str`, :class:`bytes`, 18:02:00 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 18:02:00 18:02:00 :param headers: 18:02:00 Dictionary of custom headers to send, such as User-Agent, 18:02:00 If-None-Match, etc. If None, pool headers are used. If provided, 18:02:00 these headers completely replace any pool-specific headers. 18:02:00 18:02:00 :param retries: 18:02:00 Configure the number of retries to allow before raising a 18:02:00 :class:`~urllib3.exceptions.MaxRetryError` exception. 18:02:00 18:02:00 Pass ``None`` to retry until you receive a response. Pass a 18:02:00 :class:`~urllib3.util.retry.Retry` object for fine-grained control 18:02:00 over different types of retries. 18:02:00 Pass an integer number to retry connection errors that many times, 18:02:00 but no other types of errors. Pass zero to never retry. 18:02:00 18:02:00 If ``False``, then retries are disabled and any exception is raised 18:02:00 immediately. Also, instead of raising a MaxRetryError on redirects, 18:02:00 the redirect response will be returned. 
18:02:00 18:02:00 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 18:02:00 18:02:00 :param timeout: 18:02:00 If specified, overrides the default timeout for this one 18:02:00 request. It may be a float (in seconds) or an instance of 18:02:00 :class:`urllib3.util.Timeout`. 18:02:00 18:02:00 :param chunked: 18:02:00 If True, urllib3 will send the body using chunked transfer 18:02:00 encoding. Otherwise, urllib3 will send the body using the standard 18:02:00 content-length form. Defaults to False. 18:02:00 18:02:00 :param response_conn: 18:02:00 Set this to ``None`` if you will handle releasing the connection or 18:02:00 set the connection to have the response release it. 18:02:00 18:02:00 :param preload_content: 18:02:00 If True, the response's body will be preloaded during construction. 18:02:00 18:02:00 :param decode_content: 18:02:00 If True, will attempt to decode the body based on the 18:02:00 'content-encoding' header. 18:02:00 18:02:00 :param enforce_content_length: 18:02:00 Enforce content length checking. Body returned by server must match 18:02:00 value of Content-Length header, if present. Otherwise, raise error. 18:02:00 """ 18:02:00 self.num_requests += 1 18:02:00 18:02:00 timeout_obj = self._get_timeout(timeout) 18:02:00 timeout_obj.start_connect() 18:02:00 conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) 18:02:00 18:02:00 try: 18:02:00 # Trigger any extra validation we need to do. 18:02:00 try: 18:02:00 self._validate_conn(conn) 18:02:00 except (SocketTimeout, BaseSSLError) as e: 18:02:00 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) 18:02:00 raise 18:02:00 18:02:00 # _validate_conn() starts the connection to an HTTPS proxy 18:02:00 # so we need to wrap errors with 'ProxyError' here too. 
18:02:00         except (
18:02:00             OSError,
18:02:00             NewConnectionError,
18:02:00             TimeoutError,
18:02:00             BaseSSLError,
18:02:00             CertificateError,
18:02:00             SSLError,
18:02:00         ) as e:
18:02:00             new_e: Exception = e
18:02:00             if isinstance(e, (BaseSSLError, CertificateError)):
18:02:00                 new_e = SSLError(e)
18:02:00             # If the connection didn't successfully connect to it's proxy
18:02:00             # then there
18:02:00             if isinstance(
18:02:00                 new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
18:02:00             ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
18:02:00                 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
18:02:00             raise new_e
18:02:00
18:02:00         # conn.request() calls http.client.*.request, not the method in
18:02:00         # urllib3.request. It also calls makefile (recv) on the socket.
18:02:00         try:
18:02:00             conn.request(
18:02:00                 method,
18:02:00                 url,
18:02:00                 body=body,
18:02:00                 headers=headers,
18:02:00                 chunked=chunked,
18:02:00                 preload_content=preload_content,
18:02:00                 decode_content=decode_content,
18:02:00                 enforce_content_length=enforce_content_length,
18:02:00             )
18:02:00
18:02:00         # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00         # legitimately able to close the connection after sending a valid response.
18:02:00         # With this behaviour, the received response is still readable.
18:02:00         except BrokenPipeError:
18:02:00             pass
18:02:00         except OSError as e:
18:02:00             # MacOS/Linux
18:02:00             # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00             # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00             # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00             if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                 raise
18:02:00
18:02:00         # Reset the timeout for the recv() on the socket
18:02:00         read_timeout = timeout_obj.read_timeout
18:02:00
18:02:00         if not conn.is_closed:
18:02:00             # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00             # try and read into the file pointer created by http.client, which
18:02:00             # instead raises a BadStatusLine exception. Instead of catching
18:02:00             # the exception and assuming all BadStatusLine exceptions are read
18:02:00             # timeouts, check for a zero timeout before making the request.
18:02:00             if read_timeout == 0:
18:02:00                 raise ReadTimeoutError(
18:02:00                     self, url, f"Read timed out. (read timeout={read_timeout})"
18:02:00                 )
18:02:00             conn.timeout = read_timeout
18:02:00
18:02:00         # Receive the response from the server
18:02:00         try:
18:02:00 >           response = conn.getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 b =
18:02:00
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read. If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00
18:02:00 The above exception was the direct cause of the following exception:
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:service-path'
18:02:00 timeout_value = 30
18:02:00
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout? Will raise a ReadTimeout or pass"""
18:02:00
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00
18:02:00 During handling of the above exception, another exception occurred:
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_26_service_path_create_otuc3(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'service-path',
18:02:00             {
18:02:00                 'service-name': 'service_OTUC3',
18:02:00                 'wave-number': '0',
18:02:00                 'modulation-format': 'dp-qam8',
18:02:00                 'operation': 'create',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'dest-tp': 'XPDR2-NETWORK1'}],
18:02:00                 'center-freq': 196.1,
18:02:00                 'nmc-width': 75,
18:02:00                 'min-freq': 196.0375,
18:02:00                 'max-freq': 196.125,
18:02:00                 'lower-spectral-slot-number': 755,
18:02:00                 'higher-spectral-slot-number': 768
18:02:00             })
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:447:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00             resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00         except (ProtocolError, OSError) as err:
18:02:00             raise ConnectionError(err, request=request)
18:02:00
18:02:00         except MaxRetryError as e:
18:02:00             if isinstance(e.reason, ConnectTimeoutError):
18:02:00                 # TODO: Remove this in 3.0.0: see #2811
18:02:00                 if not isinstance(e.reason, NewConnectionError):
18:02:00                     raise ConnectTimeout(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, ResponseError):
18:02:00                 raise RetryError(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, _ProxyError):
18:02:00                 raise ProxyError(e, request=request)
18:02:00
18:02:00             if isinstance(e.reason, _SSLError):
18:02:00                 # This branch is for urllib3 v1.22 and later.
18:02:00                 raise SSLError(e, request=request)
18:02:00
18:02:00             raise ConnectionError(e, request=request)
18:02:00
18:02:00         except ClosedPoolError as e:
18:02:00             raise ConnectionError(e, request=request)
18:02:00
18:02:00         except _ProxyError as e:
18:02:00             raise ProxyError(e)
18:02:00
18:02:00         except (_SSLError, _HTTPError) as e:
18:02:00             if isinstance(e, _SSLError):
18:02:00                 # This branch is for urllib3 versions earlier than v1.22
18:02:00                 raise SSLError(e, request=request)
18:02:00             elif isinstance(e, ReadTimeoutError):
18:02:00 >               raise ReadTimeout(e, request=request)
18:02:00 E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_26_service_path_create_otuc3
18:02:00 _________ TestTransportPCEOtnRenderer.test_27_get_portmapping_network1 _________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_27_get_portmapping_network1(self):
18:02:00         response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
18:02:00         self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00         self.NETWORK2_CHECK_DICT["supporting-otucn"] = "XPDR2-NETWORK1-OTUC3"
18:02:00         expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
18:02:00         response_sorted = [
18:02:00             test_utils.recursive_sort(item) for item in response['mapping']
18:02:00         ]
18:02:00 >       self.assertIn(expected_sorted, response_sorted)
18:02:00 E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC3', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:481: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_27_get_portmapping_network1
18:02:00 ___________ TestTransportPCEOtnRenderer.test_28_check_interface_otsi ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_28_check_interface_otsi(self):
18:02:00         # pylint: disable=line-too-long
18:02:00         response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-755:768")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:486: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_28_check_interface_otsi
18:02:00 __________ TestTransportPCEOtnRenderer.test_29_check_interface_otsig ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_29_check_interface_otsig(self):
18:02:00         response = test_utils.check_node_attribute_request(
18:02:00             "XPDR-A2", "interface", "XPDR2-NETWORK1-OTSIGROUP-300G")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:512: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_29_check_interface_otsig
18:02:00 __________ TestTransportPCEOtnRenderer.test_30_check_interface_otuc3 ___________
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_30_check_interface_otuc3(self):
18:02:00         response = test_utils.check_node_attribute_request(
18:02:00             "XPDR-A2", "interface", "XPDR2-NETWORK1-OTUC3")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:531: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_30_check_interface_otuc3
18:02:00 ______ TestTransportPCEOtnRenderer.test_31_otn_service_path_create_oduc3 _______
18:02:00
18:02:00 self =
18:02:00 conn =
18:02:00 method = 'POST'
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 body = '{"input": {"service-name": "service_ODUC3", "operation": "create", "service-rate": "300", "service-format": "ODU", "nodes": [{"node-id": "XPDR-A2", "network-tp": "XPDR2-NETWORK1"}]}}'
18:02:00 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '182', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
18:02:00 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
18:02:00 timeout = Timeout(connect=30, read=30, total=None), chunked = False
18:02:00 response_conn =
18:02:00 preload_content = False, decode_content = False, enforce_content_length = True
18:02:00
18:02:00     def _make_request(
18:02:00         self,
18:02:00         conn: BaseHTTPConnection,
18:02:00         method: str,
18:02:00         url: str,
18:02:00         body: _TYPE_BODY | None = None,
18:02:00         headers: typing.Mapping[str, str] | None = None,
18:02:00         retries: Retry | None = None,
18:02:00         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
18:02:00         chunked: bool = False,
18:02:00         response_conn: BaseHTTPConnection | None = None,
18:02:00         preload_content: bool = True,
18:02:00         decode_content: bool = True,
18:02:00         enforce_content_length: bool = True,
18:02:00     ) -> BaseHTTPResponse:
18:02:00         """
18:02:00         Perform a request on a given urllib connection object taken from our
18:02:00         pool.
18:02:00
18:02:00         :param conn:
18:02:00             a connection from one of our connection pools
18:02:00
18:02:00         :param method:
18:02:00             HTTP request method (such as GET, POST, PUT, etc.)
18:02:00
18:02:00         :param url:
18:02:00             The URL to perform the request on.
18:02:00
18:02:00         :param body:
18:02:00             Data to send in the request body, either :class:`str`, :class:`bytes`,
18:02:00             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
18:02:00
18:02:00         :param headers:
18:02:00             Dictionary of custom headers to send, such as User-Agent,
18:02:00             If-None-Match, etc. If None, pool headers are used. If provided,
18:02:00             these headers completely replace any pool-specific headers.
18:02:00
18:02:00         :param retries:
18:02:00             Configure the number of retries to allow before raising a
18:02:00             :class:`~urllib3.exceptions.MaxRetryError` exception.
18:02:00
18:02:00             Pass ``None`` to retry until you receive a response. Pass a
18:02:00             :class:`~urllib3.util.retry.Retry` object for fine-grained control
18:02:00             over different types of retries.
18:02:00             Pass an integer number to retry connection errors that many times,
18:02:00             but no other types of errors. Pass zero to never retry.
18:02:00
18:02:00             If ``False``, then retries are disabled and any exception is raised
18:02:00             immediately. Also, instead of raising a MaxRetryError on redirects,
18:02:00             the redirect response will be returned.
18:02:00
18:02:00         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
18:02:00
18:02:00         :param timeout:
18:02:00             If specified, overrides the default timeout for this one
18:02:00             request. It may be a float (in seconds) or an instance of
18:02:00             :class:`urllib3.util.Timeout`.
18:02:00
18:02:00         :param chunked:
18:02:00             If True, urllib3 will send the body using chunked transfer
18:02:00             encoding. Otherwise, urllib3 will send the body using the standard
18:02:00             content-length form. Defaults to False.
18:02:00
18:02:00         :param response_conn:
18:02:00             Set this to ``None`` if you will handle releasing the connection or
18:02:00             set the connection to have the response release it.
18:02:00
18:02:00         :param preload_content:
18:02:00             If True, the response's body will be preloaded during construction.
18:02:00
18:02:00         :param decode_content:
18:02:00             If True, will attempt to decode the body based on the
18:02:00             'content-encoding' header.
18:02:00
18:02:00         :param enforce_content_length:
18:02:00             Enforce content length checking. Body returned by server must match
18:02:00             value of Content-Length header, if present. Otherwise, raise error.
18:02:00         """
18:02:00         self.num_requests += 1
18:02:00
18:02:00         timeout_obj = self._get_timeout(timeout)
18:02:00         timeout_obj.start_connect()
18:02:00         conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
18:02:00
18:02:00         try:
18:02:00             # Trigger any extra validation we need to do.
18:02:00             try:
18:02:00                 self._validate_conn(conn)
18:02:00             except (SocketTimeout, BaseSSLError) as e:
18:02:00                 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
18:02:00                 raise
18:02:00
18:02:00             # _validate_conn() starts the connection to an HTTPS proxy
18:02:00             # so we need to wrap errors with 'ProxyError' here too.
18:02:00         except (
18:02:00             OSError,
18:02:00             NewConnectionError,
18:02:00             TimeoutError,
18:02:00             BaseSSLError,
18:02:00             CertificateError,
18:02:00             SSLError,
18:02:00         ) as e:
18:02:00             new_e: Exception = e
18:02:00             if isinstance(e, (BaseSSLError, CertificateError)):
18:02:00                 new_e = SSLError(e)
18:02:00             # If the connection didn't successfully connect to it's proxy
18:02:00             # then there
18:02:00             if isinstance(
18:02:00                 new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
18:02:00             ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
18:02:00                 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
18:02:00             raise new_e
18:02:00
18:02:00         # conn.request() calls http.client.*.request, not the method in
18:02:00         # urllib3.request. It also calls makefile (recv) on the socket.
18:02:00         try:
18:02:00             conn.request(
18:02:00                 method,
18:02:00                 url,
18:02:00                 body=body,
18:02:00                 headers=headers,
18:02:00                 chunked=chunked,
18:02:00                 preload_content=preload_content,
18:02:00                 decode_content=decode_content,
18:02:00                 enforce_content_length=enforce_content_length,
18:02:00             )
18:02:00
18:02:00         # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00         # legitimately able to close the connection after sending a valid response.
18:02:00         # With this behaviour, the received response is still readable.
18:02:00         except BrokenPipeError:
18:02:00             pass
18:02:00         except OSError as e:
18:02:00             # MacOS/Linux
18:02:00             # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00             # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00             # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00             if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                 raise
18:02:00
18:02:00         # Reset the timeout for the recv() on the socket
18:02:00         read_timeout = timeout_obj.read_timeout
18:02:00
18:02:00         if not conn.is_closed:
18:02:00             # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00             # try and read into the file pointer created by http.client, which
18:02:00             # instead raises a BadStatusLine exception. Instead of catching
18:02:00             # the exception and assuming all BadStatusLine exceptions are read
18:02:00             # timeouts, check for a zero timeout before making the request.
18:02:00             if read_timeout == 0:
18:02:00                 raise ReadTimeoutError(
18:02:00                     self, url, f"Read timed out. (read timeout={read_timeout})"
18:02:00                 )
18:02:00             conn.timeout = read_timeout
18:02:00
18:02:00         # Receive the response from the server
18:02:00         try:
18:02:00 >           response = conn.getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 b =
18:02:00
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read. If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00
18:02:00 The above exception was the direct cause of the following exception:
18:02:00
18:02:00 self =
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00
18:02:00 self =
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 timeout_value = 30
18:02:00
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout? Will raise a ReadTimeout or pass"""
18:02:00
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00
18:02:00 During handling of the above exception, another exception occurred:
18:02:00
18:02:00 self =
18:02:00
18:02:00     def test_31_otn_service_path_create_oduc3(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'otn-service-path',
18:02:00             {
18:02:00                 'service-name': 'service_ODUC3',
18:02:00                 'operation': 'create',
18:02:00                 'service-rate': '300',
18:02:00                 'service-format': 'ODU',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'network-tp': 'XPDR2-NETWORK1'}]
18:02:00             })
18:02:00
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:552:
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 18:02:00 18:02:00 self = 18:02:00 request = , stream = False 18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 18:02:00 proxies = OrderedDict() 18:02:00 18:02:00 def send( 18:02:00 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 18:02:00 ): 18:02:00 """Sends PreparedRequest object. Returns Response object. 18:02:00 18:02:00 :param request: The :class:`PreparedRequest ` being sent. 18:02:00 :param stream: (optional) Whether to stream the request content. 18:02:00 :param timeout: (optional) How long to wait for the server to send 18:02:00 data before giving up, as a float, or a :ref:`(connect timeout, 18:02:00 read timeout) ` tuple. 18:02:00 :type timeout: float or tuple or urllib3 Timeout object 18:02:00 :param verify: (optional) Either a boolean, in which case it controls whether 18:02:00 we verify the server's TLS certificate, or a string, in which case it 18:02:00 must be a path to a CA bundle to use 18:02:00 :param cert: (optional) Any user-provided SSL certificate to be trusted. 18:02:00 :param proxies: (optional) The proxies dictionary to apply to the request. 
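(Editor's note on the recurring frame above.) The `send()` frame that repeats throughout these tracebacks normalizes the `timeout` argument into a urllib3 `Timeout` object (aliased `TimeoutSauce`): a single number sets both the connect and the read timeout, while a `(connect, read)` tuple sets them independently. A minimal standalone sketch of that normalization, using a hypothetical helper name rather than the real code in `requests/adapters.py`:

```python
def normalize_timeout(timeout):
    """Return a (connect, read) pair from a float or a 2-tuple,
    mirroring the tuple handling in requests' HTTPAdapter.send()."""
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            # Same error requests raises for e.g. a 3-tuple.
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout "
                f"tuple, or a single float to set both timeouts to the same value."
            )
        return connect, read
    # A bare number applies to both phases, which is why these runs
    # show Timeout(connect=30, read=30, total=None).
    return timeout, timeout

print(normalize_timeout(30))        # both phases get the same value
print(normalize_timeout((5, 120)))  # separate connect and read timeouts
```

Raising the read half of that value (presumably set where `post_request()` in `test_utils.py` builds the request) is the usual first step when an RPC legitimately needs more than the 30 s seen in these failures.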
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00 
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00 
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00 
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00 
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00 
18:02:00         try:
18:02:00             resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00 
18:02:00         except (ProtocolError, OSError) as err:
18:02:00             raise ConnectionError(err, request=request)
18:02:00 
18:02:00         except MaxRetryError as e:
18:02:00             if isinstance(e.reason, ConnectTimeoutError):
18:02:00                 # TODO: Remove this in 3.0.0: see #2811
18:02:00                 if not isinstance(e.reason, NewConnectionError):
18:02:00                     raise ConnectTimeout(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, ResponseError):
18:02:00                 raise RetryError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _ProxyError):
18:02:00                 raise ProxyError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _SSLError):
18:02:00                 # This branch is for urllib3 v1.22 and later.
18:02:00                 raise SSLError(e, request=request)
18:02:00 
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except ClosedPoolError as e:
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except _ProxyError as e:
18:02:00             raise ProxyError(e)
18:02:00 
18:02:00         except (_SSLError, _HTTPError) as e:
18:02:00             if isinstance(e, _SSLError):
18:02:00                 # This branch is for urllib3 versions earlier than v1.22
18:02:00                 raise SSLError(e, request=request)
18:02:00             elif isinstance(e, ReadTimeoutError):
18:02:00 >               raise ReadTimeout(e, request=request)
18:02:00 E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_31_otn_service_path_create_oduc3
18:02:00 _________ TestTransportPCEOtnRenderer.test_32_get_portmapping_network1 _________
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_32_get_portmapping_network1(self):
18:02:00         response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
18:02:00         self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00         self.NETWORK2_CHECK_DICT["supporting-oducn"] = "XPDR2-NETWORK1-ODUC3"
18:02:00         expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
18:02:00         response_sorted = [
18:02:00             test_utils.recursive_sort(item) for item in response['mapping']
18:02:00         ]
18:02:00 >       self.assertIn(expected_sorted, response_sorted)
18:02:00 E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-oducn': 'XPDR2-NETWORK1-ODUC3', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC3', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:575: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_32_get_portmapping_network1
18:02:00 __________ TestTransportPCEOtnRenderer.test_33_check_interface_oduc3 ___________
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_33_check_interface_oduc3(self):
18:02:00         response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-ODUC3")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:579: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_33_check_interface_oduc3
18:02:00 ________ TestTransportPCEOtnRenderer.test_40_service_path_create_otuc4 _________
18:02:00 
18:02:00 self = 
18:02:00 conn = 
18:02:00 method = 'POST'
18:02:00 url = '/rests/operations/transportpce-device-renderer:service-path'
18:02:00 body = '{"input": {"service-name": "service_OTUC4", "wave-number": "0", "modulation-format": "dp-qam16", "operation": "create...75, "min-freq": 196.0375, "max-freq": 196.125, "lower-spectral-slot-number": 755, "higher-spectral-slot-number": 768}}'
18:02:00 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '337', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
18:02:00 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
18:02:00 timeout = Timeout(connect=30, read=30, total=None), chunked = False
18:02:00 response_conn = 
18:02:00 preload_content = False, decode_content = False, enforce_content_length = True
18:02:00 
18:02:00     def _make_request(
18:02:00         self,
18:02:00         conn: BaseHTTPConnection,
18:02:00         method: str,
18:02:00         url: str,
18:02:00         body: _TYPE_BODY | None = None,
18:02:00         headers: typing.Mapping[str, str] | None = None,
18:02:00         retries: Retry | None = None,
18:02:00         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
18:02:00         chunked: bool = False,
18:02:00         response_conn: BaseHTTPConnection | None = None,
18:02:00         preload_content: bool = True,
18:02:00         decode_content: bool = True,
18:02:00         enforce_content_length: bool = True,
18:02:00     ) -> BaseHTTPResponse:
18:02:00         """
18:02:00         Perform a request on a given urllib connection object taken from our
18:02:00         pool.
18:02:00 
18:02:00         :param conn:
18:02:00             a connection from one of our connection pools
18:02:00 
18:02:00         :param method:
18:02:00             HTTP request method (such as GET, POST, PUT, etc.)
18:02:00 
18:02:00         :param url:
18:02:00             The URL to perform the request on.
18:02:00 
18:02:00         :param body:
18:02:00             Data to send in the request body, either :class:`str`, :class:`bytes`,
18:02:00             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
18:02:00 
18:02:00         :param headers:
18:02:00             Dictionary of custom headers to send, such as User-Agent,
18:02:00             If-None-Match, etc. If None, pool headers are used. If provided,
18:02:00             these headers completely replace any pool-specific headers.
18:02:00 
18:02:00         :param retries:
18:02:00             Configure the number of retries to allow before raising a
18:02:00             :class:`~urllib3.exceptions.MaxRetryError` exception.
18:02:00 
18:02:00             Pass ``None`` to retry until you receive a response. Pass a
18:02:00             :class:`~urllib3.util.retry.Retry` object for fine-grained control
18:02:00             over different types of retries.
18:02:00             Pass an integer number to retry connection errors that many times,
18:02:00             but no other types of errors. Pass zero to never retry.
18:02:00 
18:02:00             If ``False``, then retries are disabled and any exception is raised
18:02:00             immediately. Also, instead of raising a MaxRetryError on redirects,
18:02:00             the redirect response will be returned.
18:02:00 
18:02:00         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
18:02:00 
18:02:00         :param timeout:
18:02:00             If specified, overrides the default timeout for this one
18:02:00             request. It may be a float (in seconds) or an instance of
18:02:00             :class:`urllib3.util.Timeout`.
18:02:00 
18:02:00         :param chunked:
18:02:00             If True, urllib3 will send the body using chunked transfer
18:02:00             encoding. Otherwise, urllib3 will send the body using the standard
18:02:00             content-length form. Defaults to False.
18:02:00 
18:02:00         :param response_conn:
18:02:00             Set this to ``None`` if you will handle releasing the connection or
18:02:00             set the connection to have the response release it.
18:02:00 
18:02:00         :param preload_content:
18:02:00             If True, the response's body will be preloaded during construction.
18:02:00 
18:02:00         :param decode_content:
18:02:00             If True, will attempt to decode the body based on the
18:02:00             'content-encoding' header.
18:02:00 
18:02:00         :param enforce_content_length:
18:02:00             Enforce content length checking. Body returned by server must match
18:02:00             value of Content-Length header, if present. Otherwise, raise error.
18:02:00         """
18:02:00         self.num_requests += 1
18:02:00 
18:02:00         timeout_obj = self._get_timeout(timeout)
18:02:00         timeout_obj.start_connect()
18:02:00         conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
18:02:00 
18:02:00         try:
18:02:00             # Trigger any extra validation we need to do.
18:02:00             try:
18:02:00                 self._validate_conn(conn)
18:02:00             except (SocketTimeout, BaseSSLError) as e:
18:02:00                 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
18:02:00                 raise
18:02:00 
18:02:00         # _validate_conn() starts the connection to an HTTPS proxy
18:02:00         # so we need to wrap errors with 'ProxyError' here too.
18:02:00         except (
18:02:00             OSError,
18:02:00             NewConnectionError,
18:02:00             TimeoutError,
18:02:00             BaseSSLError,
18:02:00             CertificateError,
18:02:00             SSLError,
18:02:00         ) as e:
18:02:00             new_e: Exception = e
18:02:00             if isinstance(e, (BaseSSLError, CertificateError)):
18:02:00                 new_e = SSLError(e)
18:02:00             # If the connection didn't successfully connect to it's proxy
18:02:00             # then there
18:02:00             if isinstance(
18:02:00                 new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
18:02:00             ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
18:02:00                 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
18:02:00             raise new_e
18:02:00 
18:02:00         # conn.request() calls http.client.*.request, not the method in
18:02:00         # urllib3.request. It also calls makefile (recv) on the socket.
18:02:00         try:
18:02:00             conn.request(
18:02:00                 method,
18:02:00                 url,
18:02:00                 body=body,
18:02:00                 headers=headers,
18:02:00                 chunked=chunked,
18:02:00                 preload_content=preload_content,
18:02:00                 decode_content=decode_content,
18:02:00                 enforce_content_length=enforce_content_length,
18:02:00             )
18:02:00 
18:02:00         # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00         # legitimately able to close the connection after sending a valid response.
18:02:00         # With this behaviour, the received response is still readable.
18:02:00         except BrokenPipeError:
18:02:00             pass
18:02:00         except OSError as e:
18:02:00             # MacOS/Linux
18:02:00             # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00             # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00             # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00             if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                 raise
18:02:00 
18:02:00         # Reset the timeout for the recv() on the socket
18:02:00         read_timeout = timeout_obj.read_timeout
18:02:00 
18:02:00         if not conn.is_closed:
18:02:00             # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00             # try and read into the file pointer created by http.client, which
18:02:00             # instead raises a BadStatusLine exception. Instead of catching
18:02:00             # the exception and assuming all BadStatusLine exceptions are read
18:02:00             # timeouts, check for a zero timeout before making the request.
18:02:00             if read_timeout == 0:
18:02:00                 raise ReadTimeoutError(
18:02:00                     self, url, f"Read timed out. (read timeout={read_timeout})"
18:02:00                 )
18:02:00             conn.timeout = read_timeout
18:02:00 
18:02:00         # Receive the response from the server
18:02:00         try:
18:02:00 >           response = conn.getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 b = 
18:02:00 
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read. If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00 
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00 
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00 
18:02:00 The above exception was the direct cause of the following exception:
18:02:00 
18:02:00 self = 
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00 
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00 
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00 
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00 
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00 
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00 
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00 
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:service-path'
18:02:00 timeout_value = 30
18:02:00 
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout?
18:02:00         Will raise a ReadTimeout or pass"""
18:02:00 
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00 
18:02:00 During handling of the above exception, another exception occurred:
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_40_service_path_create_otuc4(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'service-path',
18:02:00             {
18:02:00                 'service-name': 'service_OTUC4',
18:02:00                 'wave-number': '0',
18:02:00                 'modulation-format': 'dp-qam16',
18:02:00                 'operation': 'create',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'dest-tp': 'XPDR2-NETWORK1'}],
18:02:00                 'center-freq': 196.1,
18:02:00                 'nmc-width': 75,
18:02:00                 'min-freq': 196.0375,
18:02:00                 'max-freq': 196.125,
18:02:00                 'lower-spectral-slot-number': 755,
18:02:00                 'higher-spectral-slot-number': 768
18:02:00             })
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:675: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00 
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00 
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00 
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00 
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00 
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00 
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00 
18:02:00         try:
18:02:00             resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00 
18:02:00         except (ProtocolError, OSError) as err:
18:02:00             raise ConnectionError(err, request=request)
18:02:00 
18:02:00         except MaxRetryError as e:
18:02:00             if isinstance(e.reason, ConnectTimeoutError):
18:02:00                 # TODO: Remove this in 3.0.0: see #2811
18:02:00                 if not isinstance(e.reason, NewConnectionError):
18:02:00                     raise ConnectTimeout(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, ResponseError):
18:02:00                 raise RetryError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _ProxyError):
18:02:00                 raise ProxyError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _SSLError):
18:02:00                 # This branch is for urllib3 v1.22 and later.
18:02:00                 raise SSLError(e, request=request)
18:02:00 
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except ClosedPoolError as e:
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except _ProxyError as e:
18:02:00             raise ProxyError(e)
18:02:00 
18:02:00         except (_SSLError, _HTTPError) as e:
18:02:00             if isinstance(e, _SSLError):
18:02:00                 # This branch is for urllib3 versions earlier than v1.22
18:02:00                 raise SSLError(e, request=request)
18:02:00             elif isinstance(e, ReadTimeoutError):
18:02:00 >               raise ReadTimeout(e, request=request)
18:02:00 E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_40_service_path_create_otuc4
18:02:00 _________ TestTransportPCEOtnRenderer.test_41_get_portmapping_network1 _________
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_41_get_portmapping_network1(self):
18:02:00         response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
18:02:00         self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00         self.NETWORK2_CHECK_DICT["supporting-otucn"] = "XPDR2-NETWORK1-OTUC4"
18:02:00         expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
18:02:00         response_sorted = [
18:02:00             test_utils.recursive_sort(item) for item in response['mapping']
18:02:00         ]
18:02:00 >       self.assertIn(expected_sorted, response_sorted)
18:02:00 E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC4', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]
18:02:00 18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:709: AssertionError 18:02:00 ----------------------------- Captured stdout call ----------------------------- 18:02:00 execution of test_41_get_portmapping_network1 18:02:00 ___________ TestTransportPCEOtnRenderer.test_42_check_interface_otsi ___________ 18:02:00 18:02:00 self = 18:02:00 18:02:00 def test_42_check_interface_otsi(self): 18:02:00 # pylint: disable=line-too-long 18:02:00 response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-755:768") 18:02:00 > self.assertEqual(response['status_code'], requests.codes.ok) 18:02:00 E AssertionError: 409 != 200 18:02:00 18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:714: AssertionError 18:02:00 ----------------------------- Captured stdout call ----------------------------- 18:02:00 execution of test_42_check_interface_otsi 18:02:00 __________ TestTransportPCEOtnRenderer.test_43_check_interface_otsig ___________ 18:02:00 18:02:00 self = 18:02:00 18:02:00 def test_43_check_interface_otsig(self): 18:02:00 response = test_utils.check_node_attribute_request( 18:02:00 "XPDR-A2", "interface", "XPDR2-NETWORK1-OTSIGROUP-400G") 18:02:00 > self.assertEqual(response['status_code'], requests.codes.ok) 18:02:00 E AssertionError: 409 != 200 18:02:00 18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:740: AssertionError 18:02:00 ----------------------------- Captured stdout call ----------------------------- 18:02:00 execution of test_43_check_interface_otsig 18:02:00 __________ TestTransportPCEOtnRenderer.test_44_check_interface_otuc4 ___________ 18:02:00 18:02:00 self = 18:02:00 18:02:00 def test_44_check_interface_otuc4(self): 18:02:00 response = test_utils.check_node_attribute_request( 18:02:00 "XPDR-A2", "interface", "XPDR2-NETWORK1-OTUC4") 18:02:00 > self.assertEqual(response['status_code'], requests.codes.ok) 18:02:00 E AssertionError: 409 != 200 18:02:00 18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:759: 
AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_44_check_interface_otuc4
18:02:00 ______ TestTransportPCEOtnRenderer.test_45_otn_service_path_create_oduc3 _______
18:02:00 
18:02:00 self = 
18:02:00 conn = 
18:02:00 method = 'POST'
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 body = '{"input": {"service-name": "service_ODUC4", "operation": "create", "service-rate": "400", "service-format": "ODU", "nodes": [{"node-id": "XPDR-A2", "network-tp": "XPDR2-NETWORK1"}]}}'
18:02:00 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '182', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
18:02:00 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
18:02:00 timeout = Timeout(connect=30, read=30, total=None), chunked = False
18:02:00 response_conn = 
18:02:00 preload_content = False, decode_content = False, enforce_content_length = True
18:02:00 
18:02:00     def _make_request(
18:02:00         self,
18:02:00         conn: BaseHTTPConnection,
18:02:00         method: str,
18:02:00         url: str,
18:02:00         body: _TYPE_BODY | None = None,
18:02:00         headers: typing.Mapping[str, str] | None = None,
18:02:00         retries: Retry | None = None,
18:02:00         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
18:02:00         chunked: bool = False,
18:02:00         response_conn: BaseHTTPConnection | None = None,
18:02:00         preload_content: bool = True,
18:02:00         decode_content: bool = True,
18:02:00         enforce_content_length: bool = True,
18:02:00     ) -> BaseHTTPResponse:
18:02:00         """
18:02:00         Perform a request on a given urllib connection object taken from our
18:02:00         pool.
18:02:00 
18:02:00         :param conn:
18:02:00             a connection from one of our connection pools
18:02:00 
18:02:00         :param method:
18:02:00             HTTP request method (such as GET, POST, PUT, etc.)
18:02:00 
18:02:00         :param url:
18:02:00             The URL to perform the request on.
18:02:00 
18:02:00         :param body:
18:02:00             Data to send in the request body, either :class:`str`, :class:`bytes`,
18:02:00             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
18:02:00 
18:02:00         :param headers:
18:02:00             Dictionary of custom headers to send, such as User-Agent,
18:02:00             If-None-Match, etc. If None, pool headers are used. If provided,
18:02:00             these headers completely replace any pool-specific headers.
18:02:00 
18:02:00         :param retries:
18:02:00             Configure the number of retries to allow before raising a
18:02:00             :class:`~urllib3.exceptions.MaxRetryError` exception.
18:02:00 
18:02:00             Pass ``None`` to retry until you receive a response. Pass a
18:02:00             :class:`~urllib3.util.retry.Retry` object for fine-grained control
18:02:00             over different types of retries.
18:02:00             Pass an integer number to retry connection errors that many times,
18:02:00             but no other types of errors. Pass zero to never retry.
18:02:00 
18:02:00             If ``False``, then retries are disabled and any exception is raised
18:02:00             immediately. Also, instead of raising a MaxRetryError on redirects,
18:02:00             the redirect response will be returned.
18:02:00 
18:02:00         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
18:02:00 
18:02:00         :param timeout:
18:02:00             If specified, overrides the default timeout for this one
18:02:00             request. It may be a float (in seconds) or an instance of
18:02:00             :class:`urllib3.util.Timeout`.
18:02:00 
18:02:00         :param chunked:
18:02:00             If True, urllib3 will send the body using chunked transfer
18:02:00             encoding. Otherwise, urllib3 will send the body using the standard
18:02:00             content-length form. Defaults to False.
18:02:00 
18:02:00         :param response_conn:
18:02:00             Set this to ``None`` if you will handle releasing the connection or
18:02:00             set the connection to have the response release it.
18:02:00 
18:02:00         :param preload_content:
18:02:00             If True, the response's body will be preloaded during construction.
18:02:00 
18:02:00         :param decode_content:
18:02:00             If True, will attempt to decode the body based on the
18:02:00             'content-encoding' header.
18:02:00 
18:02:00         :param enforce_content_length:
18:02:00             Enforce content length checking. Body returned by server must match
18:02:00             value of Content-Length header, if present. Otherwise, raise error.
18:02:00         """
18:02:00         self.num_requests += 1
18:02:00 
18:02:00         timeout_obj = self._get_timeout(timeout)
18:02:00         timeout_obj.start_connect()
18:02:00         conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)
18:02:00 
18:02:00         try:
18:02:00             # Trigger any extra validation we need to do.
18:02:00             try:
18:02:00                 self._validate_conn(conn)
18:02:00             except (SocketTimeout, BaseSSLError) as e:
18:02:00                 self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
18:02:00                 raise
18:02:00 
18:02:00         # _validate_conn() starts the connection to an HTTPS proxy
18:02:00         # so we need to wrap errors with 'ProxyError' here too.
18:02:00         except (
18:02:00             OSError,
18:02:00             NewConnectionError,
18:02:00             TimeoutError,
18:02:00             BaseSSLError,
18:02:00             CertificateError,
18:02:00             SSLError,
18:02:00         ) as e:
18:02:00             new_e: Exception = e
18:02:00             if isinstance(e, (BaseSSLError, CertificateError)):
18:02:00                 new_e = SSLError(e)
18:02:00             # If the connection didn't successfully connect to it's proxy
18:02:00             # then there
18:02:00             if isinstance(
18:02:00                 new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
18:02:00             ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
18:02:00                 new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
18:02:00             raise new_e
18:02:00 
18:02:00         # conn.request() calls http.client.*.request, not the method in
18:02:00         # urllib3.request. It also calls makefile (recv) on the socket.
18:02:00         try:
18:02:00             conn.request(
18:02:00                 method,
18:02:00                 url,
18:02:00                 body=body,
18:02:00                 headers=headers,
18:02:00                 chunked=chunked,
18:02:00                 preload_content=preload_content,
18:02:00                 decode_content=decode_content,
18:02:00                 enforce_content_length=enforce_content_length,
18:02:00             )
18:02:00 
18:02:00         # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
18:02:00         # legitimately able to close the connection after sending a valid response.
18:02:00         # With this behaviour, the received response is still readable.
18:02:00         except BrokenPipeError:
18:02:00             pass
18:02:00         except OSError as e:
18:02:00             # MacOS/Linux
18:02:00             # EPROTOTYPE and ECONNRESET are needed on macOS
18:02:00             # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
18:02:00             # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
18:02:00             if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
18:02:00                 raise
18:02:00 
18:02:00         # Reset the timeout for the recv() on the socket
18:02:00         read_timeout = timeout_obj.read_timeout
18:02:00 
18:02:00         if not conn.is_closed:
18:02:00             # In Python 3 socket.py will catch EAGAIN and return None when you
18:02:00             # try and read into the file pointer created by http.client, which
18:02:00             # instead raises a BadStatusLine exception. Instead of catching
18:02:00             # the exception and assuming all BadStatusLine exceptions are read
18:02:00             # timeouts, check for a zero timeout before making the request.
18:02:00             if read_timeout == 0:
18:02:00                 raise ReadTimeoutError(
18:02:00                     self, url, f"Read timed out.
(read timeout={read_timeout})"
18:02:00                 )
18:02:00             conn.timeout = read_timeout
18:02:00 
18:02:00         # Receive the response from the server
18:02:00         try:
18:02:00 >           response = conn.getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:534: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connection.py:571: in getresponse
18:02:00     httplib_response = super().getresponse()
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1395: in getresponse
18:02:00     response.begin()
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:325: in begin
18:02:00     version, status, reason = self._read_status()
18:02:00                               ^^^^^^^^^^^^^^^^^^^
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:286: in _read_status
18:02:00     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 b = 
18:02:00 
18:02:00     def readinto(self, b):
18:02:00         """Read up to len(b) bytes into the writable buffer *b* and return
18:02:00         the number of bytes read. If the socket is non-blocking and no bytes
18:02:00         are available, None is returned.
18:02:00 
18:02:00         If *b* is non-empty, a 0 return value indicates that the connection
18:02:00         was shutdown at the other end.
18:02:00         """
18:02:00         self._checkClosed()
18:02:00         self._checkReadable()
18:02:00         if self._timeout_occurred:
18:02:00             raise OSError("cannot read from timed out object")
18:02:00         while True:
18:02:00             try:
18:02:00 >               return self._sock.recv_into(b)
18:02:00                        ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 E               TimeoutError: timed out
18:02:00 
18:02:00 /opt/pyenv/versions/3.11.10/lib/python3.11/socket.py:718: TimeoutError
18:02:00 
18:02:00 The above exception was the direct cause of the following exception:
18:02:00 
18:02:00 self = 
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00 
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00 
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00 
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00 
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00 
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00 
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00 
18:02:00         try:
18:02:00 >           resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:644: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
18:02:00     retries = retries.increment(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/retry.py:490: in increment
18:02:00     raise reraise(type(error), error, _stacktrace)
18:02:00           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/util/util.py:39: in reraise
18:02:00     raise value
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:787: in urlopen
18:02:00     response = self._make_request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:536: in _make_request
18:02:00     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 err = TimeoutError('timed out')
18:02:00 url = '/rests/operations/transportpce-device-renderer:otn-service-path'
18:02:00 timeout_value = 30
18:02:00 
18:02:00     def _raise_timeout(
18:02:00         self,
18:02:00         err: BaseSSLError | OSError | SocketTimeout,
18:02:00         url: str,
18:02:00         timeout_value: _TYPE_TIMEOUT | None,
18:02:00     ) -> None:
18:02:00         """Is the error actually a timeout?
Will raise a ReadTimeout or pass"""
18:02:00 
18:02:00         if isinstance(err, SocketTimeout):
18:02:00 >           raise ReadTimeoutError(
18:02:00                 self, url, f"Read timed out. (read timeout={timeout_value})"
18:02:00             ) from err
18:02:00 E           urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='localhost', port=8184): Read timed out. (read timeout=30)
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/urllib3/connectionpool.py:367: ReadTimeoutError
18:02:00 
18:02:00 During handling of the above exception, another exception occurred:
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_45_otn_service_path_create_oduc3(self):
18:02:00 >       response = test_utils.transportpce_api_rpc_request(
18:02:00             'transportpce-device-renderer', 'otn-service-path',
18:02:00             {
18:02:00                 'service-name': 'service_ODUC4',
18:02:00                 'operation': 'create',
18:02:00                 'service-rate': '400',
18:02:00                 'service-format': 'ODU',
18:02:00                 'nodes': [{'node-id': 'XPDR-A2', 'network-tp': 'XPDR2-NETWORK1'}]
18:02:00             })
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:780: 
18:02:00 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
18:02:00     response = post_request(url, data)
18:02:00                ^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 transportpce_tests/common/test_utils.py:143: in post_request
18:02:00     return requests.request(
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/api.py:59: in request
18:02:00     return session.request(method=method, url=url, **kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:589: in request
18:02:00     resp = self.send(prep, **send_kwargs)
18:02:00            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/sessions.py:703: in send
18:02:00     r = adapter.send(request, **kwargs)
18:02:00         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
18:02:00 _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
18:02:00 
18:02:00 self = 
18:02:00 request = , stream = False
18:02:00 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
18:02:00 proxies = OrderedDict()
18:02:00 
18:02:00     def send(
18:02:00         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
18:02:00     ):
18:02:00         """Sends PreparedRequest object. Returns Response object.
18:02:00 
18:02:00         :param request: The :class:`PreparedRequest ` being sent.
18:02:00         :param stream: (optional) Whether to stream the request content.
18:02:00         :param timeout: (optional) How long to wait for the server to send
18:02:00             data before giving up, as a float, or a :ref:`(connect timeout,
18:02:00             read timeout) ` tuple.
18:02:00         :type timeout: float or tuple or urllib3 Timeout object
18:02:00         :param verify: (optional) Either a boolean, in which case it controls whether
18:02:00             we verify the server's TLS certificate, or a string, in which case it
18:02:00             must be a path to a CA bundle to use
18:02:00         :param cert: (optional) Any user-provided SSL certificate to be trusted.
18:02:00         :param proxies: (optional) The proxies dictionary to apply to the request.
18:02:00         :rtype: requests.Response
18:02:00         """
18:02:00 
18:02:00         try:
18:02:00             conn = self.get_connection_with_tls_context(
18:02:00                 request, verify, proxies=proxies, cert=cert
18:02:00             )
18:02:00         except LocationValueError as e:
18:02:00             raise InvalidURL(e, request=request)
18:02:00 
18:02:00         self.cert_verify(conn, request.url, verify, cert)
18:02:00         url = self.request_url(request, proxies)
18:02:00         self.add_headers(
18:02:00             request,
18:02:00             stream=stream,
18:02:00             timeout=timeout,
18:02:00             verify=verify,
18:02:00             cert=cert,
18:02:00             proxies=proxies,
18:02:00         )
18:02:00 
18:02:00         chunked = not (request.body is None or "Content-Length" in request.headers)
18:02:00 
18:02:00         if isinstance(timeout, tuple):
18:02:00             try:
18:02:00                 connect, read = timeout
18:02:00                 timeout = TimeoutSauce(connect=connect, read=read)
18:02:00             except ValueError:
18:02:00                 raise ValueError(
18:02:00                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
18:02:00                     f"or a single float to set both timeouts to the same value."
18:02:00                 )
18:02:00         elif isinstance(timeout, TimeoutSauce):
18:02:00             pass
18:02:00         else:
18:02:00             timeout = TimeoutSauce(connect=timeout, read=timeout)
18:02:00 
18:02:00         try:
18:02:00             resp = conn.urlopen(
18:02:00                 method=request.method,
18:02:00                 url=url,
18:02:00                 body=request.body,
18:02:00                 headers=request.headers,
18:02:00                 redirect=False,
18:02:00                 assert_same_host=False,
18:02:00                 preload_content=False,
18:02:00                 decode_content=False,
18:02:00                 retries=self.max_retries,
18:02:00                 timeout=timeout,
18:02:00                 chunked=chunked,
18:02:00             )
18:02:00 
18:02:00         except (ProtocolError, OSError) as err:
18:02:00             raise ConnectionError(err, request=request)
18:02:00 
18:02:00         except MaxRetryError as e:
18:02:00             if isinstance(e.reason, ConnectTimeoutError):
18:02:00                 # TODO: Remove this in 3.0.0: see #2811
18:02:00                 if not isinstance(e.reason, NewConnectionError):
18:02:00                     raise ConnectTimeout(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, ResponseError):
18:02:00                 raise RetryError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _ProxyError):
18:02:00                 raise ProxyError(e, request=request)
18:02:00 
18:02:00             if isinstance(e.reason, _SSLError):
18:02:00                 # This branch is for urllib3 v1.22 and later.
18:02:00                 raise SSLError(e, request=request)
18:02:00 
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except ClosedPoolError as e:
18:02:00             raise ConnectionError(e, request=request)
18:02:00 
18:02:00         except _ProxyError as e:
18:02:00             raise ProxyError(e)
18:02:00 
18:02:00         except (_SSLError, _HTTPError) as e:
18:02:00             if isinstance(e, _SSLError):
18:02:00                 # This branch is for urllib3 versions earlier than v1.22
18:02:00                 raise SSLError(e, request=request)
18:02:00             elif isinstance(e, ReadTimeoutError):
18:02:00 >               raise ReadTimeout(e, request=request)
18:02:00 E               requests.exceptions.ReadTimeout: HTTPConnectionPool(host='localhost', port=8184): Read timed out.
(read timeout=30)
18:02:00 
18:02:00 ../.tox/tests71/lib/python3.11/site-packages/requests/adapters.py:690: ReadTimeout
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_45_otn_service_path_create_oduc3
18:02:00 _________ TestTransportPCEOtnRenderer.test_46_get_portmapping_network1 _________
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_46_get_portmapping_network1(self):
18:02:00         response = test_utils.get_portmapping_node_attr("XPDR-A2", "mapping", "XPDR2-NETWORK1")
18:02:00         self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00         self.NETWORK2_CHECK_DICT["supporting-oducn"] = "XPDR2-NETWORK1-ODUC4"
18:02:00         expected_sorted = test_utils.recursive_sort(self.NETWORK2_CHECK_DICT)
18:02:00         response_sorted = [
18:02:00             test_utils.recursive_sort(item) for item in response['mapping']
18:02:00         ]
18:02:00 >       self.assertIn(expected_sorted, response_sorted)
18:02:00 E       AssertionError: {'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET', 'supporting-oducn': 'XPDR2-NETWORK1-ODUC4', 'supporting-otucn': 'XPDR2-NETWORK1-OTUC4', 'supporting-port': 'L1', 'xpdr-type': 'mpdr'} not found in [{'lcp-hash-val': 'LY9PxYJqUbw=', 'logical-connection-point': 'XPDR2-NETWORK1', 'port-admin-state': 'InService', 'port-direction': 'bidirectional', 'port-oper-state': 'InService', 'port-qual': 'switch-network', 'rate': '200', 'supported-interface-capability': ['org-openroadm-port-types:if-otsi-otsigroup'], 'supported-operational-mode': ['OR-W-100G-oFEC-31.6Gbd', 'OR-W-200G-oFEC-31.6Gbd'], 'supporting-circuit-pack-name': '1/2/2-PLUG-NET',
'supporting-port': 'L1', 'xpdr-type': 'mpdr'}]
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:803: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_46_get_portmapping_network1
18:02:00 __________ TestTransportPCEOtnRenderer.test_47_check_interface_oduc4 ___________
18:02:00 
18:02:00 self = 
18:02:00 
18:02:00     def test_47_check_interface_oduc4(self):
18:02:00         response = test_utils.check_node_attribute_request("XPDR-A2", "interface", "XPDR2-NETWORK1-ODUC4")
18:02:00 >       self.assertEqual(response['status_code'], requests.codes.ok)
18:02:00 E       AssertionError: 409 != 200
18:02:00 
18:02:00 transportpce_tests/7.1/test02_otn_renderer.py:807: AssertionError
18:02:00 ----------------------------- Captured stdout call -----------------------------
18:02:00 execution of test_47_check_interface_oduc4
18:02:00 =========================== short test summary info ============================
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_02_service_path_create_otuc2
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_03_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_04_check_interface_otsi
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_05_check_interface_otsig
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_06_check_interface_otuc2
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_07_otn_service_path_create_oduc2
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_08_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_09_check_interface_oduc2
18:02:00 FAILED
transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_10_otn_service_path_create_100ge
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_11_check_interface_100ge_client
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_12_check_interface_odu4_client
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_13_check_interface_odu4_network
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_14_check_odu_connection_xpdra2
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_26_service_path_create_otuc3
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_27_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_28_check_interface_otsi
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_29_check_interface_otsig
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_30_check_interface_otuc3
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_31_otn_service_path_create_oduc3
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_32_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_33_check_interface_oduc3
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_40_service_path_create_otuc4
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_41_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_42_check_interface_otsi
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_43_check_interface_otsig
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_44_check_interface_otuc4
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_45_otn_service_path_create_oduc3
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_46_get_portmapping_network1
18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_47_check_interface_oduc4
18:02:00 29 failed, 33 passed in 370.81s (0:06:10)
18:02:00 tests121: FAIL ✖ in 7 minutes 47.44 seconds
18:02:00 tests71: exit 1 (425.80 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1 pid=12492
18:02:06 .................. [100%]
18:05:12 36 passed in 338.89s (0:05:38)
18:05:12 pytest -q transportpce_tests/tapi/test03_tapi_device_change_notifications.py
18:06:00 ....................................................................... [100%]
18:10:31 71 passed in 318.49s (0:05:18)
18:10:31 pytest -q transportpce_tests/tapi/test04_topo_extension.py
18:11:22 ................... [100%]
18:12:54 19 passed in 142.36s (0:02:22)
18:12:54 pytest -q transportpce_tests/tapi/test05_pce_tapi.py
18:14:56 ......................
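The "short test summary info" block above follows pytest's fixed `FAILED <module>::<class>::<test>` format, which makes it easy to post-process in CI. A small sketch (the helper name and the sample lines are illustrative, not part of the build tooling) that tallies failures per test module from such log lines:

```python
from collections import Counter

def tally_failures(log_lines):
    # Count FAILED entries per test module from a pytest
    # "short test summary info" section (format assumed from the log above:
    # an optional timestamp prefix, then "FAILED <nodeid>").
    counts = Counter()
    for line in log_lines:
        if "FAILED " in line:
            # Node id is the first token after "FAILED ".
            nodeid = line.split("FAILED ", 1)[1].split()[0]
            # The module path is everything before the first "::".
            module = nodeid.split("::", 1)[0]
            counts[module] += 1
    return counts

summary = [
    "18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_44_check_interface_otuc4",
    "18:02:00 FAILED transportpce_tests/7.1/test02_otn_renderer.py::TestTransportPCEOtnRenderer::test_45_otn_service_path_create_oduc3",
]
```

Applied to the full summary above, this would attribute all 29 failures of the tests71 run to `transportpce_tests/7.1/test02_otn_renderer.py`.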
[100%]
18:23:32 22 passed in 638.17s (0:10:38)
18:23:32 tests71: FAIL ✖ in 7 minutes 13.52 seconds
18:23:32 tests_tapi: OK ✔ in 32 minutes 17.01 seconds
18:23:32 tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
18:23:40 tests221: freeze> python -m pip freeze --all
18:23:40 tests221: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
18:23:40 tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1
18:23:40 using environment variables from ./karaf221.env
18:23:40 pytest -q transportpce_tests/2.2.1/test01_portmapping.py
18:24:18 ................................... [100%]
18:24:58 35 passed in 77.79s (0:01:17)
18:24:58 pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py
18:25:30 ...... [100%]
18:25:44 6 passed in 45.39s
18:25:44 pytest -q transportpce_tests/2.2.1/test03_topology.py
18:26:28 ............................................ [100%]
18:28:03 44 passed in 138.51s (0:02:18)
18:28:03 pytest -q transportpce_tests/2.2.1/test04_otn_topology.py
18:28:39 ............ [100%]
18:29:03 12 passed in 59.93s
18:29:03 pytest -q transportpce_tests/2.2.1/test05_flex_grid.py
18:29:29 ................ [100%]
18:30:58 16 passed in 114.92s (0:01:54)
18:30:58 pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py
18:31:29 ............................... [100%]
18:31:36 31 passed in 37.28s
18:31:36 pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py
18:32:12 ..........................
[100%]
18:33:07 26 passed in 91.23s (0:01:31)
18:33:07 pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py
18:33:44 ...................... [100%]
18:34:47 22 passed in 100.02s (0:01:40)
18:34:47 pytest -q transportpce_tests/2.2.1/test09_olm.py
18:35:30 ........................................ [100%]
18:37:52 40 passed in 184.30s (0:03:04)
18:37:52 pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py
18:38:36 ........................................................................ [ 74%]
18:44:12 ......................... [100%]
18:46:04 97 passed in 492.10s (0:08:12)
18:46:04 pytest -q transportpce_tests/2.2.1/test12_end2end.py
18:46:45 ...................................................... [100%]
18:53:33 54 passed in 448.53s (0:07:28)
18:53:33 pytest -q transportpce_tests/2.2.1/test14_otn_switch_end2end.py
18:54:29 ........................................................................ [ 71%]
18:59:37 ............................. [100%]
19:01:47 101 passed in 493.20s (0:08:13)
19:01:47 pytest -q transportpce_tests/2.2.1/test15_otn_end2end_with_intermediate_switch.py
19:02:42 ........................................................................ [ 67%]
19:08:29 ................................... [100%]
19:14:50 107 passed in 782.79s (0:13:02)
19:14:50 pytest -q transportpce_tests/2.2.1/test16_freq_end2end.py
19:15:33 .............................................
[100%]
19:18:11 45 passed in 200.85s (0:03:20)
19:18:11 tests221: OK ✔ in 54 minutes 38.76 seconds
19:18:11 tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
19:18:19 tests_hybrid: freeze> python -m pip freeze --all
19:18:19 tests_hybrid: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.2.1,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==80.9.0,urllib3==2.6.3
19:18:19 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid
19:18:19 using environment variables from ./karaf221.env
19:18:19 pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py
19:18:59 .............FFF...................FF..............
[100%]
19:23:42 =================================== FAILURES ===================================
19:23:42 __ TestTransportPCEDeviceChangeNotifications.test_14_check_update_portmapping __
19:23:42 
19:23:42 self = 
19:23:42 
19:23:42     def test_14_check_update_portmapping(self):
19:23:42         response = test_utils.get_portmapping_node_attr("XPDRA01", None, None)
19:23:42         self.assertEqual(response['status_code'], requests.codes.ok)
19:23:42         mapping_list = response['nodes'][0]['mapping']
19:23:42         for mapping in mapping_list:
19:23:42             if mapping['logical-connection-point'] == 'XPDR1-NETWORK1':
19:23:42 >               self.assertEqual(mapping['port-oper-state'], 'OutOfService',
19:23:42                     "Operational State should be 'OutOfService'")
19:23:42 E               AssertionError: 'InService' != 'OutOfService'
19:23:42 E               - InService
19:23:42 E               + OutOfService
19:23:42 E               : Operational State should be 'OutOfService'
19:23:42 
19:23:42 transportpce_tests/hybrid/test01_device_change_notifications.py:218: AssertionError
19:23:42 ----------------------------- Captured stdout call -----------------------------
19:23:42 execution of test_14_check_update_portmapping
19:23:42 _ TestTransportPCEDeviceChangeNotifications.test_15_check_update_openroadm_topo _
19:23:42 
19:23:42 self = 
19:23:42 
19:23:42     def test_15_check_update_openroadm_topo(self):
19:23:42         response = test_utils.get_ietf_network_request('openroadm-topology', 'config')
19:23:42         self.assertEqual(response['status_code'], requests.codes.ok)
19:23:42         node_list = response['network'][0]['node']
19:23:42         nb_updated_tp = 0
19:23:42         for node in node_list:
19:23:42             self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService')
19:23:42             self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService')
19:23:42             tp_list = node['ietf-network-topology:termination-point']
19:23:42             for tp in tp_list:
19:23:42                 if node['node-id'] == 'XPDRA01-XPDR1' and tp['tp-id'] == 'XPDR1-NETWORK1':
19:23:42 >                   
self.assertEqual(tp['org-openroadm-common-network:operational-state'], 'outOfService') 19:23:42 E AssertionError: 'inService' != 'outOfService' 19:23:42 E - inService 19:23:42 E + outOfService 19:23:42 19:23:42 transportpce_tests/hybrid/test01_device_change_notifications.py:240: AssertionError 19:23:42 ----------------------------- Captured stdout call ----------------------------- 19:23:42 execution of test_15_check_update_openroadm_topo 19:23:42 ___ TestTransportPCEDeviceChangeNotifications.test_16_check_update_service1 ____ 19:23:42 19:23:42 self = 19:23:42 19:23:42 def test_16_check_update_service1(self): 19:23:42 response = test_utils.get_ordm_serv_list_attr_request("services", "service1") 19:23:42 self.assertEqual(response['status_code'], requests.codes.ok) 19:23:42 > self.assertEqual(response['services'][0]['operational-state'], 'outOfService') 19:23:42 E AssertionError: 'inService' != 'outOfService' 19:23:42 E - inService 19:23:42 E + outOfService 19:23:42 19:23:42 transportpce_tests/hybrid/test01_device_change_notifications.py:266: AssertionError 19:23:42 ----------------------------- Captured stdout call ----------------------------- 19:23:42 execution of test_16_check_update_service1 19:23:42 __ TestTransportPCEDeviceChangeNotifications.test_36_check_update_portmapping __ 19:23:42 19:23:42 self = 19:23:42 19:23:42 def test_36_check_update_portmapping(self): 19:23:42 response = test_utils.get_portmapping_node_attr("XPDR-C1", None, None) 19:23:42 self.assertEqual(response['status_code'], requests.codes.ok) 19:23:42 mapping_list = response['nodes'][0]['mapping'] 19:23:42 for mapping in mapping_list: 19:23:42 if mapping['logical-connection-point'] == 'XPDR1-NETWORK1': 19:23:42 > self.assertEqual(mapping['port-oper-state'], 'OutOfService', 19:23:42 "Operational State should be 'OutOfService'") 19:23:42 E AssertionError: 'InService' != 'OutOfService' 19:23:42 E - InService 19:23:42 E + OutOfService 19:23:42 E : Operational State should be 'OutOfService' 
19:23:42 19:23:42 transportpce_tests/hybrid/test01_device_change_notifications.py:498: AssertionError 19:23:42 ----------------------------- Captured stdout call ----------------------------- 19:23:42 execution of test_36_check_update_portmapping 19:23:42 _ TestTransportPCEDeviceChangeNotifications.test_37_check_update_openroadm_topo _ 19:23:42 19:23:42 self = 19:23:42 19:23:42 def test_37_check_update_openroadm_topo(self): 19:23:42 response = test_utils.get_ietf_network_request('openroadm-topology', 'config') 19:23:42 self.assertEqual(response['status_code'], requests.codes.ok) 19:23:42 node_list = response['network'][0]['node'] 19:23:42 nb_updated_tp = 0 19:23:42 for node in node_list: 19:23:42 self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') 19:23:42 self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') 19:23:42 tp_list = node['ietf-network-topology:termination-point'] 19:23:42 for tp in tp_list: 19:23:42 if node['node-id'] == 'XPDR-C1-XPDR1' and tp['tp-id'] == 'XPDR1-NETWORK1': 19:23:42 > self.assertEqual(tp['org-openroadm-common-network:operational-state'], 'outOfService') 19:23:42 E AssertionError: 'inService' != 'outOfService' 19:23:42 E - inService 19:23:42 E + outOfService 19:23:42 19:23:42 transportpce_tests/hybrid/test01_device_change_notifications.py:520: AssertionError 19:23:42 ----------------------------- Captured stdout call ----------------------------- 19:23:42 execution of test_37_check_update_openroadm_topo 19:23:42 =========================== short test summary info ============================ 19:23:42 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_14_check_update_portmapping 19:23:42 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_15_check_update_openroadm_topo 19:23:42 FAILED 
transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_16_check_update_service1 19:23:42 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_36_check_update_portmapping 19:23:42 FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_37_check_update_openroadm_topo 19:23:42 5 failed, 46 passed in 322.79s (0:05:22) 19:23:42 tests_hybrid: exit 1 (323.07 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid pid=89712 19:23:42 buildcontroller: OK (113.73=setup[8.82]+cmd[104.91] seconds) 19:23:42 sims: OK (12.84=setup[8.81]+cmd[4.03] seconds) 19:23:42 build_karaf_tests121: OK (64.46=setup[7.88]+cmd[56.58] seconds) 19:23:42 testsPCE: OK (286.83=setup[53.28]+cmd[233.55] seconds) 19:23:42 tests121: FAIL code 1 (467.44=setup[8.31]+cmd[459.13] seconds) 19:23:42 build_karaf_tests221: OK (67.49=setup[7.85]+cmd[59.64] seconds) 19:23:42 tests_tapi: OK (1937.01=setup[8.32]+cmd[1928.69] seconds) 19:23:42 tests221: OK (3278.76=setup[7.74]+cmd[3271.01] seconds) 19:23:42 build_karaf_tests71: OK (67.46=setup[7.82]+cmd[59.64] seconds) 19:23:42 tests71: FAIL code 1 (433.52=setup[7.72]+cmd[425.80] seconds) 19:23:42 build_karaf_tests190: OK (67.54=setup[7.93]+cmd[59.61] seconds) 19:23:42 tests190: FAIL code 1 (211.56=setup[8.23]+cmd[203.33] seconds) 19:23:42 tests_hybrid: FAIL code 1 (330.80=setup[7.73]+cmd[323.07] seconds) 19:23:42 buildlighty: OK (35.74=setup[11.56]+cmd[24.18] seconds) 19:23:42 docs: OK (34.48=setup[31.83]+cmd[2.66] seconds) 19:23:42 docs-linkcheck: OK (36.23=setup[30.67]+cmd[5.56] seconds) 19:23:42 checkbashisms: OK (3.38=setup[2.10]+cmd[0.01,0.06,1.20] seconds) 19:23:42 pre-commit: OK (51.14=setup[3.08]+cmd[0.00,0.02,39.14,8.90] seconds) 19:23:42 pylint: OK (30.69=setup[5.06]+cmd[25.63] seconds) 19:23:42 evaluation failed :( (6014.72 
seconds) 19:23:42 + tox_status=255 19:23:42 + echo '---> Completed tox runs' 19:23:42 ---> Completed tox runs 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/build_karaf_tests121/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=build_karaf_tests121 19:23:42 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/build_karaf_tests190/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=build_karaf_tests190 19:23:42 + cp -r .tox/build_karaf_tests190/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests190 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/build_karaf_tests221/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=build_karaf_tests221 19:23:42 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests221 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/build_karaf_tests71/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=build_karaf_tests71 19:23:42 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests71 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/buildcontroller/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=buildcontroller 19:23:42 + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildcontroller 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/buildlighty/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=buildlighty 19:23:42 + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildlighty 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/checkbashisms/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=checkbashisms 19:23:42 + cp -r .tox/checkbashisms/log 
/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/checkbashisms 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/docs-linkcheck/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=docs-linkcheck 19:23:42 + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs-linkcheck 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/docs/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=docs 19:23:42 + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/pre-commit/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=pre-commit 19:23:42 + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pre-commit 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/pylint/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=pylint 19:23:42 + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pylint 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/sims/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=sims 19:23:42 + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/sims 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests121/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests121 19:23:42 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests121 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests190/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests190 19:23:42 + cp -r .tox/tests190/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests190 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests221/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests221 19:23:42 + cp -r .tox/tests221/log 
/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests221 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests71/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests71 19:23:42 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests71 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/testsPCE/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=testsPCE 19:23:42 + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/testsPCE 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests_hybrid/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests_hybrid 19:23:42 + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_hybrid 19:23:42 + for i in .tox/*/log 19:23:42 ++ echo .tox/tests_tapi/log 19:23:42 ++ awk -F/ '{print $2}' 19:23:42 + tox_env=tests_tapi 19:23:42 + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_tapi 19:23:42 + DOC_DIR=docs/_build/html 19:23:42 + [[ -d docs/_build/html ]] 19:23:42 + echo '---> Archiving generated docs' 19:23:42 ---> Archiving generated docs 19:23:42 + mv docs/_build/html /w/workspace/transportpce-tox-verify-transportpce-master/archives/docs 19:23:42 + echo '---> tox-run.sh ends' 19:23:42 ---> tox-run.sh ends 19:23:42 + test 255 -eq 0 19:23:42 + exit 255 19:23:42 ++ '[' 1 = 1 ']' 19:23:42 ++ '[' -x /usr/bin/clear_console ']' 19:23:42 ++ /usr/bin/clear_console -q 19:23:42 Build step 'Execute shell' marked build as failure 19:23:42 $ ssh-agent -k 19:23:42 unset SSH_AUTH_SOCK; 19:23:42 unset SSH_AGENT_PID; 19:23:42 echo Agent pid 1578 killed; 19:23:42 [ssh-agent] Stopped. 19:23:42 [PostBuildScript] - [INFO] Executing post build scripts. 
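All five hybrid failures are the same shape: the test asserts an operational-state transition (`InService` → `OutOfService`) immediately after a simulated device change notification, and observes the old value. When such assertions race against notification propagation, one common remedy is to poll for the expected state with a timeout instead of asserting once. This is only an illustrative sketch, not the project's actual test code: `wait_for_state` and its `fetch_state` parameter are hypothetical stand-ins for helpers like `test_utils.get_portmapping_node_attr`.

```python
import time


def wait_for_state(fetch_state, expected, timeout=30.0, interval=2.0):
    """Poll fetch_state() until it returns `expected` or the timeout expires.

    Returns the last observed value so the caller can still assert on it and
    get a clear 'InService' != 'OutOfService' diff on a genuine failure.
    """
    deadline = time.monotonic() + timeout
    observed = fetch_state()
    while observed != expected and time.monotonic() < deadline:
        time.sleep(interval)
        observed = fetch_state()
    return observed


# Stand-in for the device: the state flips only on the third poll.
states = iter(['InService', 'InService', 'OutOfService'])
result = wait_for_state(lambda: next(states), 'OutOfService',
                        timeout=5.0, interval=0.01)
```

A test would then end with `self.assertEqual(result, 'OutOfService', ...)`, keeping the original failure message while tolerating notification latency.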
19:23:42 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15926917116721595274.sh
19:23:42 ---> sysstat.sh
19:23:43 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins16046077014136557279.sh
19:23:43 ---> package-listing.sh
19:23:43 ++ tr '[:upper:]' '[:lower:]'
19:23:43 ++ facter osfamily
19:23:43 + OS_FAMILY=debian
19:23:43 + workspace=/w/workspace/transportpce-tox-verify-transportpce-master
19:23:43 + START_PACKAGES=/tmp/packages_start.txt
19:23:43 + END_PACKAGES=/tmp/packages_end.txt
19:23:43 + DIFF_PACKAGES=/tmp/packages_diff.txt
19:23:43 + PACKAGES=/tmp/packages_start.txt
19:23:43 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']'
19:23:43 + PACKAGES=/tmp/packages_end.txt
19:23:43 + case "${OS_FAMILY}" in
19:23:43 + dpkg -l
19:23:43 + grep '^ii'
19:23:43 + '[' -f /tmp/packages_start.txt ']'
19:23:43 + '[' -f /tmp/packages_end.txt ']'
19:23:43 + diff /tmp/packages_start.txt /tmp/packages_end.txt
19:23:43 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']'
19:23:43 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/
19:23:43 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/
19:23:43 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins9524295116893140017.sh
19:23:43 ---> capture-instance-metadata.sh
19:23:43 Setup pyenv:
19:23:44   system
19:23:44   3.8.20
19:23:44   3.9.20
19:23:44   3.10.15
19:23:44 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
19:23:44 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-6Heh from file:/tmp/.os_lf_venv
19:23:44 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
19:23:44 lf-activate-venv(): INFO: Attempting to install with network-safe options...
19:23:46 lf-activate-venv(): INFO: Base packages installed successfully
19:23:46 lf-activate-venv(): INFO: Installing additional packages: lftools
19:24:01 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
19:24:01 kubernetes 34.1.0 requires urllib3<2.4.0,>=1.24.2, but you have urllib3 2.6.3 which is incompatible.
19:24:01 lf-activate-venv(): INFO: Adding /tmp/venv-6Heh/bin to PATH
19:24:01 INFO: Running in OpenStack, capturing instance metadata
19:24:01 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15692936524972691295.sh
19:24:01 provisioning config files...
19:24:02 Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #4199
19:24:02 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config3070999936754050358tmp
19:24:02 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
19:24:02 Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
19:24:02 provisioning config files...
19:24:02 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
19:24:02 [EnvInject] - Injecting environment variables from a build step.
19:24:02 [EnvInject] - Injecting as environment variables the properties content
19:24:02 SERVER_ID=logs
19:24:02
19:24:02 [EnvInject] - Variables injected successfully.
19:24:02 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins3300240218304707500.sh
19:24:02 ---> create-netrc.sh
19:24:02 WARN: Log server credential not found.
19:24:02 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins9604470197601772508.sh
19:24:02 ---> python-tools-install.sh
19:24:02 Setup pyenv:
19:24:02   system
19:24:02   3.8.20
19:24:02   3.9.20
19:24:02   3.10.15
19:24:02 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
19:24:03 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-6Heh from file:/tmp/.os_lf_venv
19:24:03 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
19:24:03 lf-activate-venv(): INFO: Attempting to install with network-safe options...
19:24:04 lf-activate-venv(): INFO: Base packages installed successfully
19:24:04 lf-activate-venv(): INFO: Installing additional packages: lftools
19:24:15 lf-activate-venv(): INFO: Adding /tmp/venv-6Heh/bin to PATH
19:24:15 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins14109731553658684623.sh
19:24:15 ---> sudo-logs.sh
19:24:15 Archiving 'sudo' log..
19:24:15 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins4815298085420521551.sh
19:24:15 ---> job-cost.sh
19:24:15 Setup pyenv:
19:24:15   system
19:24:15   3.8.20
19:24:15   3.9.20
19:24:15   3.10.15
19:24:15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
19:24:15 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-6Heh from file:/tmp/.os_lf_venv
19:24:15 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
19:24:15 lf-activate-venv(): INFO: Attempting to install with network-safe options...
19:24:18 lf-activate-venv(): INFO: Base packages installed successfully
19:24:18 lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
19:24:24 lf-activate-venv(): INFO: Adding /tmp/venv-6Heh/bin to PATH
19:24:24 INFO: No Stack...
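The `pip` resolver warning earlier in this run (`kubernetes 34.1.0 requires urllib3<2.4.0,>=1.24.2, but you have urllib3 2.6.3`) is an ordinary specifier-set violation: the installed version satisfies the lower bound but not the upper one. A minimal stdlib-only sketch of that check, with a naive numeric version parser that is sufficient for these particular version strings (real tooling uses the `packaging` library instead):

```python
def parse(v):
    """Naive version parse: '2.6.3' -> (2, 6, 3). Numeric components only."""
    return tuple(int(x) for x in v.split('.'))


def satisfies(installed, lower, upper):
    """Check '>=lower,<upper' the way the resolver compares versions."""
    return parse(lower) <= parse(installed) < parse(upper)


# kubernetes 34.1.0 declares urllib3<2.4.0,>=1.24.2; the venv has urllib3 2.6.3
ok = satisfies("2.6.3", "1.24.2", "2.4.0")  # False -> pip prints the conflict
```

Here 2.6.3 clears the 1.24.2 floor but exceeds the 2.4.0 ceiling, which is exactly what the log reports.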
19:24:25 INFO: Retrieving Pricing Info for: v3-standard-4
19:24:25 INFO: Archiving Costs
19:24:25 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins9773780075929791734.sh
19:24:25 ---> logs-deploy.sh
19:24:25 Setup pyenv:
19:24:25   system
19:24:25   3.8.20
19:24:25   3.9.20
19:24:25   3.10.15
19:24:25 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
19:24:25 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-6Heh from file:/tmp/.os_lf_venv
19:24:25 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
19:24:25 lf-activate-venv(): INFO: Attempting to install with network-safe options...
19:24:27 lf-activate-venv(): INFO: Base packages installed successfully
19:24:27 lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
19:24:38 lf-activate-venv(): INFO: Adding /tmp/venv-6Heh/bin to PATH
19:24:38 WARNING: Nexus logging server not set
19:24:38 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/4199/
19:24:38 INFO: archiving logs to S3
19:24:40 ---> uname -a:
19:24:40 Linux prd-ubuntu2204-docker-4c-16g-260 5.15.0-164-generic #174-Ubuntu SMP Fri Nov 14 20:25:16 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
19:24:40
19:24:40
19:24:40 ---> lscpu:
19:24:40 Architecture:         x86_64
19:24:40 CPU op-mode(s):       32-bit, 64-bit
19:24:40 Address sizes:        40 bits physical, 48 bits virtual
19:24:40 Byte Order:           Little Endian
19:24:40 CPU(s):               4
19:24:40 On-line CPU(s) list:  0-3
19:24:40 Vendor ID:            AuthenticAMD
19:24:40 Model name:           AMD EPYC-Rome Processor
19:24:40 CPU family:           23
19:24:40 Model:                49
19:24:40 Thread(s) per core:   1
19:24:40 Core(s) per socket:   1
19:24:40 Socket(s):            4
19:24:40 Stepping:             0
19:24:40 BogoMIPS:             5599.99
19:24:40 Flags:                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
19:24:40 Virtualization:       AMD-V
19:24:40 Hypervisor vendor:    KVM
19:24:40 Virtualization type:  full
19:24:40 L1d cache:            128 KiB (4 instances)
19:24:40 L1i cache:            128 KiB (4 instances)
19:24:40 L2 cache:             2 MiB (4 instances)
19:24:40 L3 cache:             64 MiB (4 instances)
19:24:40 NUMA node(s):         1
19:24:40 NUMA node0 CPU(s):    0-3
19:24:40 Vulnerability Gather data sampling:        Not affected
19:24:40 Vulnerability Indirect target selection:   Not affected
19:24:40 Vulnerability Itlb multihit:               Not affected
19:24:40 Vulnerability L1tf:                        Not affected
19:24:40 Vulnerability Mds:                         Not affected
19:24:40 Vulnerability Meltdown:                    Not affected
19:24:40 Vulnerability Mmio stale data:             Not affected
19:24:40 Vulnerability Reg file data sampling:      Not affected
19:24:40 Vulnerability Retbleed:                    Mitigation; untrained return thunk; SMT disabled
19:24:40 Vulnerability Spec rstack overflow:        Mitigation; SMT disabled
19:24:40 Vulnerability Spec store bypass:           Mitigation; Speculative Store Bypass disabled via prctl and seccomp
19:24:40 Vulnerability Spectre v1:                  Mitigation; usercopy/swapgs barriers and __user pointer sanitization
19:24:40 Vulnerability Spectre v2:                  Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
19:24:40 Vulnerability Srbds:                       Not affected
19:24:40 Vulnerability Tsa:                         Not affected
19:24:40 Vulnerability Tsx async abort:             Not affected
19:24:40 Vulnerability Vmscape:                     Not affected
19:24:40
19:24:40
19:24:40 ---> nproc:
19:24:40 4
19:24:40
19:24:40
19:24:40 ---> df -h:
19:24:40 Filesystem      Size  Used Avail Use% Mounted on
19:24:40 tmpfs           1.6G  1.1M  1.6G   1% /run
19:24:40 /dev/vda1        78G   18G   61G  23% /
19:24:40 tmpfs           7.9G     0  7.9G   0% /dev/shm
19:24:40 tmpfs           5.0M     0  5.0M   0% /run/lock
19:24:40 /dev/vda15      105M  6.1M   99M   6% /boot/efi
19:24:40 tmpfs           1.6G  4.0K  1.6G   1% /run/user/1001
19:24:40
19:24:40
19:24:40 ---> free -m:
19:24:40                total        used        free      shared  buff/cache   available
19:24:40 Mem:           15989         685       10918           3        4385       14961
19:24:40 Swap:           1023           1        1022
19:24:40
19:24:40
19:24:40 ---> ip addr:
19:24:40 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
19:24:40     link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
19:24:40     inet 127.0.0.1/8 scope host lo
19:24:40        valid_lft forever preferred_lft forever
19:24:40     inet6 ::1/128 scope host
19:24:40        valid_lft forever preferred_lft forever
19:24:40 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
19:24:40     link/ether fa:16:3e:29:25:41 brd ff:ff:ff:ff:ff:ff
19:24:40     altname enp0s3
19:24:40     inet 10.30.170.202/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
19:24:40        valid_lft 80201sec preferred_lft 80201sec
19:24:40     inet6 fe80::f816:3eff:fe29:2541/64 scope link
19:24:40        valid_lft forever preferred_lft forever
19:24:40 3: docker0: mtu 1458 qdisc noqueue state DOWN group default
19:24:40     link/ether a6:31:94:c0:9a:ac brd ff:ff:ff:ff:ff:ff
19:24:40     inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0
19:24:40        valid_lft forever preferred_lft forever
19:24:40
19:24:40
19:24:40 ---> sar -b -r -n DEV:
19:24:40 Linux 5.15.0-164-generic (prd-ubuntu2204-docker-4c-16g-260)  01/14/26  _x86_64_  (4 CPU)
19:24:40
19:24:40 17:41:23 LINUX RESTART (4 CPU)
19:24:40
19:24:40 17:50:01      tps     rtps     wtps     dtps  bread/s  bwrtn/s  bdscd/s
19:24:40 18:00:04    76.10    23.06    49.88     3.16   316.36  5289.02  2154.59
19:24:40 18:10:01     5.61     0.09     5.20     0.32     2.29   190.38   422.22
19:24:40 18:20:04     7.71     0.05     7.31     0.35     2.83   217.04   345.42
19:24:40 18:30:04    15.96     0.11    15.17     0.68     3.19   737.48   691.19
19:24:40 18:40:01    14.28     0.01    13.69     0.58     0.64   445.69   480.62
19:24:40 18:50:03     5.52     0.00     5.35     0.17     0.16   143.85    80.99
19:24:40 19:00:04     4.78     0.01     4.59     0.17     1.52   145.25   161.36
19:24:40 19:10:01    21.82     0.12     5.66    16.05     2.21   160.22 212803.59
19:24:40 19:20:04     9.69     0.01     9.31     0.38     0.32   554.80   151.31
19:24:40 Average:    17.96     2.62    12.92     2.42    36.76   878.36 24005.57
19:24:40
19:24:40 17:50:01 kbmemfree  kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty
19:24:40 18:00:04   7685392 11001400   4934604    30.14    260944  3049424  6683864   38.37  2203552 5931268    3008
19:24:40 18:10:01   7657372 10985548   4952028    30.24    262048  3060556  5653208   32.45  2231296 5933324     280
19:24:40 18:20:04   6385972  9731384   6205428    37.90    263000  3076844  6955608   39.93  2238484 7198540     244
19:24:40 18:30:04  10242320 13718460   2220740    13.56    268340  3202204  2968080   17.04  2276836 3309380     280
19:24:40 18:40:01   7885924 11426752   4511176    27.55    270564  3264668  5249376   30.13  2281168 5662036     252
19:24:40 18:50:03   7881540 11440356   4497400    27.47    271356  3281864  5205676   29.88  2282624 5664032     276
19:24:40 19:00:04   6311708  9891568   6045312    36.92    271888  3302376  6806056   39.07  2284620 7220880     384
19:24:40 19:10:01   6298128  9905720   6031116    36.84    274668  3327336  6798856   39.03  2294768 7230012     348
19:24:40 19:20:04   7548928 11276424   4661204    28.47    278532  3443300  5295372   30.40  2302644 5940632     616
19:24:40 Average:   7544143 11041957   4895445    29.90    269038  3223175  5735122   32.92  2266221 6010012     632
19:24:40
19:24:40 17:50:01    IFACE  rxpck/s  txpck/s   rxkB/s   txkB/s  rxcmp/s  txcmp/s rxmcst/s %ifutil
19:24:40 18:00:04       lo   572.64   572.64   296.93   296.93     0.00     0.00     0.00    0.00
19:24:40 18:00:04     ens3     2.01     1.92     0.63     0.60     0.00     0.00     0.00    0.00
19:24:40 18:00:04  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 18:10:01       lo   224.95   224.95   126.81   126.81     0.00     0.00     0.00    0.00
19:24:40 18:10:01     ens3     1.48     1.36     0.30     1.36     0.00     0.00     0.00    0.00
19:24:40 18:10:01  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 18:20:04       lo     9.78     9.78     6.16     6.16     0.00     0.00     0.00    0.00
19:24:40 18:20:04     ens3     0.56     0.36     0.15     0.10     0.00     0.00     0.00    0.00
19:24:40 18:20:04  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 18:30:04       lo     9.52     9.52     4.70     4.70     0.00     0.00     0.00    0.00
19:24:40 18:30:04     ens3     0.97     0.75     0.29     0.22     0.00     0.00     0.00    0.00
19:24:40 18:30:04  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 18:40:01       lo    19.27    19.27    10.00    10.00     0.00     0.00     0.00    0.00
19:24:40 18:40:01     ens3     0.94     0.71     0.23     0.18     0.00     0.00     0.00    0.00
19:24:40 18:40:01  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 18:50:03       lo    21.84    21.84     8.92     8.92     0.00     0.00     0.00    0.00
19:24:40 18:50:03     ens3     0.78     0.56     0.20     0.15     0.00     0.00     0.00    0.00
19:24:40 18:50:03  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 19:00:04       lo    25.62    25.62    11.02    11.02     0.00     0.00     0.00    0.00
19:24:40 19:00:04     ens3     0.55     0.44     0.12     0.09     0.00     0.00     0.00    0.00
19:24:40 19:00:04  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 19:10:01       lo    17.36    17.36    10.43    10.43     0.00     0.00     0.00    0.00
19:24:40 19:10:01     ens3     0.71     0.54     0.19     0.14     0.00     0.00     0.00    0.00
19:24:40 19:10:01  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 19:20:04       lo   159.56   159.56    69.75    69.75     0.00     0.00     0.00    0.00
19:24:40 19:20:04     ens3     0.68     0.60     0.20     0.16     0.00     0.00     0.00    0.00
19:24:40 19:20:04  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40 Average:       lo   118.05   118.05    60.62    60.62     0.00     0.00     0.00    0.00
19:24:40 Average:     ens3     0.96     0.81     0.26     0.33     0.00     0.00     0.00    0.00
19:24:40 Average:  docker0     0.00     0.00     0.00     0.00     0.00     0.00     0.00    0.00
19:24:40
19:24:40
19:24:40 ---> sar -P ALL:
19:24:40 Linux 5.15.0-164-generic (prd-ubuntu2204-docker-4c-16g-260)  01/14/26  _x86_64_  (4 CPU)
19:24:40
19:24:40 17:41:23 LINUX RESTART (4 CPU)
19:24:40
19:24:40 17:50:01  CPU    %user    %nice  %system  %iowait   %steal    %idle
19:24:40 18:00:04  all    67.35     0.00     2.91     0.14     0.13    29.47
19:24:40 18:00:04    0    68.72     0.00     2.92     0.07     0.13    28.17
19:24:40 18:00:04    1    67.53     0.00     2.97     0.11     0.14    29.25
19:24:40 18:00:04    2    66.88     0.00     2.93     0.20     0.13    29.86
19:24:40 18:00:04    3    66.27     0.00     2.81     0.17     0.12    30.62
19:24:40 18:10:01  all    17.32     0.00     1.07     0.07     0.13    81.42
19:24:40 18:10:01    0    18.44     0.00     1.07     0.02     0.12    80.34
19:24:40 18:10:01    1    16.45     0.00     1.15     0.10     0.13    82.18
19:24:40 18:10:01    2    17.90     0.00     1.07     0.01     0.13    80.88
19:24:40 18:10:01    3    16.47     0.00     0.99     0.15     0.12    82.27
19:24:40 18:20:04  all    15.58     0.00     0.73     0.06     0.13    83.51
19:24:40 18:20:04    0    15.76     0.00     0.60     0.03     0.11    83.49
19:24:40 18:20:04    1    15.19     0.00     0.64     0.07     0.12    83.97
19:24:40 18:20:04    2    15.44     0.00     0.85     0.01     0.13    83.57
19:24:40 18:20:04    3    15.91     0.00     0.83     0.12     0.13    83.01
19:24:40 18:30:04  all    21.49     0.00     0.87     0.10     0.12    77.42
19:24:40 18:30:04    0    21.10     0.00     0.88     0.03     0.12    77.87
19:24:40 18:30:04    1    21.21     0.00     0.95     0.26     0.12    77.46
19:24:40 18:30:04    2    21.89     0.00     0.72     0.04     0.12    77.24
19:24:40 18:30:04    3    21.75     0.00     0.93     0.08     0.12    77.13
19:24:40 18:40:01  all    25.06     0.00     0.99     0.10     0.12    73.72
19:24:40 18:40:01    0    24.51     0.00     0.91     0.07     0.12    74.40
19:24:40 18:40:01    1    24.23     0.00     1.16     0.16     0.13    74.33
19:24:40 18:40:01    2    25.88     0.00     1.03     0.09     0.12    72.88
19:24:40 18:40:01    3    25.63     0.00     0.89     0.10     0.12    73.27
19:24:40 18:50:03  all     8.35     0.00     0.48     0.04     0.12    91.01
19:24:40 18:50:03    0     8.45     0.00     0.44     0.03     0.11    90.96
19:24:40 18:50:03    1     8.74     0.00     0.48     0.03     0.12    90.64
19:24:40 18:50:03    2     7.80     0.00     0.48     0.02     0.12    91.58
19:24:40 18:50:03    3     8.40     0.00     0.53     0.09     0.12    90.86
19:24:40 19:00:04  all    10.36     0.00     0.52     0.03     0.12    88.98
19:24:40 19:00:04    0    10.32     0.00     0.49     0.07     0.10    89.01
19:24:40 19:00:04    1     9.97     0.00     0.53     0.00     0.12    89.38
19:24:40 19:00:04    2    10.70     0.00     0.50     0.02     0.12    88.66
19:24:40 19:00:04    3    10.46     0.00     0.54     0.03     0.12    88.85
19:24:40 19:10:01  all     9.63     0.00     0.44     0.05     0.12    89.75
19:24:40 19:10:01    0     9.28     0.00     0.46     0.12     0.11    90.03
19:24:40 19:10:01    1     9.90     0.00     0.46     0.02     0.13    89.50
19:24:40 19:10:01    2     9.64     0.00     0.49     0.05     0.12    89.70
19:24:40 19:10:01    3     9.72     0.00     0.36     0.02     0.13    89.78
19:24:40 19:20:04  all    16.26     0.00     0.83     0.04     0.13    82.74
19:24:40 19:20:04    0    17.97     0.00     0.82     0.02     0.12    81.07
19:24:40 19:20:04    1    15.06     0.00     0.91     0.01     0.14    83.88
19:24:40 19:20:04    2    15.87     0.00     0.80     0.09     0.12    83.13
19:24:40 19:20:04    3    16.14     0.00     0.78     0.06     0.13    82.89
19:24:40 Average:  all    21.27     0.00     0.98     0.07     0.12    77.56
19:24:40 Average:    0    21.62     0.00     0.96     0.05     0.12    77.25
19:24:40 Average:    1    20.92     0.00     1.03     0.08     0.13    77.84
19:24:40 Average:    2    21.33     0.00     0.99     0.06     0.12    77.51
19:24:40 Average:    3    21.20     0.00     0.96     0.09     0.12    77.62
19:24:40
19:24:40