docs: install_deps> python -I -m pip install -r docs/requirements.txt
checkbashisms: freeze> python -m pip freeze --all
buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt
checkbashisms: pip==24.1,setuptools==70.1.0,wheel==0.43.0
checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)'
checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' +
checkbashisms: OK ✔ in 2.59 seconds
pre-commit: install_deps> python -I -m pip install pre-commit
pre-commit: freeze> python -m pip freeze --all
pre-commit: cfgv==3.4.0,distlib==0.3.8,filelock==3.15.4,identify==2.6.0,nodeenv==1.9.1,pip==24.1,platformdirs==4.2.2,pre-commit==3.8.0,PyYAML==6.0.2,setuptools==70.1.0,virtualenv==20.26.3,wheel==0.43.0
pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)'
/usr/bin/cpan
pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure
[INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Initializing environment for https://github.com/jorisroovers/gitlint.
[INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps].
[INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks.
buildcontroller: freeze> python -m pip freeze --all
[INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8.
[INFO] Initializing environment for https://github.com/perltidy/perltidy.
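The checkbashisms step above simply sweeps every shell script in the repository. For reference, a minimal local equivalent is sketched below in Python; it assumes the checkbashisms binary is on PATH and mirrors the logged find command rather than any project script.

    # Sketch: local equivalent of the checkbashisms tox step (not a project file).
    # Assumes the `checkbashisms` executable is installed and on PATH.
    import pathlib
    import subprocess
    import sys

    # Collect every *.sh file, skipping hidden directories like the find command does.
    scripts = [str(p) for p in pathlib.Path(".").rglob("*.sh")
               if not any(part.startswith(".") for part in p.parts)]
    result = subprocess.run(["checkbashisms", "-f", *scripts], check=False)
    sys.exit(result.returncode)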
buildcontroller: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh
java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64
java-1.12.0-openjdk-amd64 1211 /usr/lib/jvm/java-1.12.0-openjdk-amd64
java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64
java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64
java-1.8.0-openjdk-amd64 1081 /usr/lib/jvm/java-1.8.0-openjdk-amd64
[INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
21
21
ok, java is 21 or newer
Apache Maven 3.9.8 (36645f6c9b5079805ea5009217e36f2cffd34256)
Maven home: /opt/maven
Java version: 21.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64
Default locale: en, platform encoding: UTF-8
OS name: "linux", version: "5.4.0-190-generic", arch: "amd64", family: "unix"
[INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/perltidy/perltidy.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
docs: freeze> python -m pip freeze --all
docs: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.8,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.1,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==70.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.2,webcolors==24.8.0,wheel==0.43.0
docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html
trim trailing whitespace.................................................Passed
Tabs remover.............................................................docs-linkcheck: freeze> python -m pip freeze --all
Passed
autopep8.................................................................docs-linkcheck: alabaster==0.7.16,attrs==24.2.0,babel==2.16.0,blockdiag==3.0.0,certifi==2024.8.30,charset-normalizer==3.3.2,contourpy==1.3.0,cycler==0.12.1,docutils==0.20.1,fonttools==4.53.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.8,imagesize==1.4.1,Jinja2==3.1.4,jsonschema==3.2.0,kiwisolver==1.4.7,lfdocs-conf==0.9.0,MarkupSafe==2.1.5,matplotlib==3.9.2,numpy==2.1.1,nwdiag==3.0.0,packaging==24.1,pillow==10.4.0,pip==24.1,Pygments==2.18.0,pyparsing==3.1.4,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.2,requests==2.32.3,requests-file==1.5.1,seqdiag==3.0.0,setuptools==70.1.0,six==1.16.0,snowballstemmer==2.2.0,Sphinx==7.4.7,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==2.0.0,sphinx-tabs==3.4.5,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.30,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.2.2,webcolors==24.8.0,wheel==0.43.0
docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck
docs: OK ✔ in 41.97 seconds
pylint: install_deps> python -I -m pip install 'pylint>=2.6.0'
Passed
perltidy.................................................................Passed
docs-linkcheck: OK ✔ in 43.79 seconds
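The docs and docs-linkcheck environments are ordinary Sphinx builds. A rough Python-driven equivalent of the two logged invocations is sketched below; the output directories are illustrative stand-ins for the Jenkins workspace paths, not paths the project defines.

    # Sketch: the two sphinx-build commands above (nitpicky HTML build with
    # warnings treated as errors, then the linkcheck builder), run from tests/.
    from sphinx.cmd.build import build_main

    html_rc = build_main(["-q", "-W", "--keep-going", "-b", "html", "-n",
                          "-d", "_build/doctrees", "../docs/", "_build/html"])
    links_rc = build_main(["-q", "-b", "linkcheck",
                           "-d", "_build/doctrees", "../docs/", "_build/linkcheck"])
    print("html:", html_rc, "linkcheck:", links_rc)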
pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual
[INFO] Installing environment for https://github.com/jorisroovers/gitlint.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
pylint: freeze> python -m pip freeze --all
pylint: astroid==3.2.4,dill==0.3.8,isort==5.13.2,mccabe==0.7.0,pip==24.1,platformdirs==4.2.2,pylint==3.2.7,setuptools==70.1.0,tomlkit==0.13.2,wheel==0.43.0
pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' +
gitlint..................................................................Passed
------------------------------------
Your code has been rated at 10.00/10
pre-commit: OK ✔ in 52.51 seconds
pylint: OK ✔ in 25.77 seconds
buildcontroller: OK ✔ in 2 minutes 5.31 seconds
sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests221: freeze> python -m pip freeze --all
sims: freeze> python -m pip freeze --all
build_karaf_tests121: freeze> python -m pip freeze --all
build_karaf_tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
sims: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh
Using lighynode version 19.1.0.5
Installing lightynode device to ./lightynode/lightynode-openroadm-device directory
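The pylint environment logged above lints the functional-test tree and requires a 10.00/10 score (--fail-under=10). A hedged local equivalent, calling the pylint CLI with the same options from the tests/ directory, is sketched below; the file collection simply replaces the find/-exec part of the logged command.

    # Sketch: run pylint over transportpce_tests/ with the same options as the
    # tox pylint environment. Assumes pylint is installed in the active venv.
    import pathlib
    import subprocess
    import sys

    files = sorted(str(p) for p in pathlib.Path("transportpce_tests").rglob("*.py"))
    cmd = [
        "pylint", "--fail-under=10", "--max-line-length=120",
        "--disable=missing-docstring,import-error",
        "--disable=fixme", "--disable=duplicate-code",
        "--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)",
        "--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$",
        "--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$",
    ] + files
    sys.exit(subprocess.run(cmd, check=False).returncode)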
build_karaf_tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
sims: OK ✔ in 10.83 seconds
build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests71: freeze> python -m pip freeze --all
build_karaf_tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
build_karaf_tests121: OK ✔ in 1 minute 1.85 seconds
build_karaf_tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests221: OK ✔ in 1 minute 5.43 seconds
tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
build_karaf_tests_hybrid: freeze> python -m pip freeze --all
build_karaf_tests_hybrid: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
build_karaf_tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
tests_tapi: freeze> python -m pip freeze --all
tests_tapi: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi
using environment variables from ./karaf221.env
pytest -q transportpce_tests/tapi/test01_abstracted_topology.py
build_karaf_tests71: OK ✔ in 1 minute 36.38 seconds
testsPCE: freeze> python -m pip freeze --all
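Both tests_tapi and testsPCE drive a freshly built controller over RESTCONF on localhost:8181, and the PCE failures below are all connection-refused errors against that port. A readiness probe along the following lines can help distinguish "controller not up yet" from genuine test failures. This is only a sketch: the host, port, path and admin/admin credentials are read off the request details in the tracebacks below, not taken from project documentation.

    # Sketch of a RESTCONF readiness probe for the controller under test.
    import time
    import requests

    URL = "http://localhost:8181/rests/data/transportpce-portmapping:network"

    def wait_for_controller(timeout_s: float = 180.0) -> bool:
        """Return True once anything answers on the RESTCONF port."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            try:
                # Any HTTP status (200, 401, 404, ...) proves the listener is up.
                requests.get(URL, auth=("admin", "admin"), timeout=5)
                return True
            except requests.exceptions.ConnectionError:
                time.sleep(5)
        return False

    if __name__ == "__main__":
        print("controller reachable:", wait_for_controller())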
bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,click==8.1.7,contourpy==1.3.0,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.6,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.53.1,gnpy4tpce==2.4.7,idna==3.8,iniconfig==2.0.0,injector==0.22.0,itsdangerous==2.2.0,Jinja2==3.1.4,kiwisolver==1.4.7,lxml==5.3.0,MarkupSafe==2.1.5,matplotlib==3.9.2,netconf-client==3.1.1,networkx==2.8.8,numpy==1.26.4,packaging==24.1,pandas==1.5.3,paramiko==3.4.1,pbr==5.11.1,pillow==10.4.0,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pyparsing==3.1.4,pytest==8.3.2,python-dateutil==2.9.0.post0,pytz==2024.1,requests==2.32.3,scipy==1.14.1,setuptools==50.3.2,six==1.16.0,urllib3==2.2.2,Werkzeug==2.0.3,wheel==0.43.0,xlrd==1.2.0 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pytest -q transportpce_tests/pce/test01_pce.py FFFFFFFFFFFFFFFFFFFFE [100%] ==================================== ERRORS ==================================== _ ERROR at teardown of TransportPCEtesting.test_20_path_computation_after_oms_attribute_deletion _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE', url = '/rests/data/transportpce-portmapping:network' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). 
Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. 
body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE', url = '/rests/data/transportpce-portmapping:network' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? 
if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: cls = @classmethod def tearDownClass(cls): # clean datastores > test_utils.del_portmapping() transportpce_tests/pce/test01_pce.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:454: in del_portmapping response = delete_request(url[RESTCONF_VERSION].format('{}')) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError =================================== FAILURES =================================== ________________ TransportPCEtesting.test_01_load_port_mapping _________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST', url = '/rests/data/transportpce-portmapping:network' body = '{"nodes": [{"node-id": "OpenROADM-1-1", "node-info": {"node-type": "rdm", "openroadm-version": "1.2.1", "node-clli": ..., "node-ip-address": "1.2.3.4"}, "mapping": [], "mc-capabilities": [], "cp-to-degree": [], "switching-pool-lcp": []}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '12976', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. 
:param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST', url = '/rests/data/transportpce-portmapping:network' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_01_load_port_mapping(self): > response = test_utils.post_portmapping(self.port_mapping_data) transportpce_tests/pce/test01_pce.py:111: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:447: in post_portmapping response = post_request(url[RESTCONF_VERSION].format('{}'), json_payload) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ---------------------------- Captured stdout setup ----------------------------- sample files content loaded starting OpenDaylight... starting KARAF TransportPCE build... Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! OpenDaylight started ! _____________ TransportPCEtesting.test_02_load_simple_topology_bi ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
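The [Errno 111] seen above means nothing was accepting TCP connections on localhost:8181 when test_utils.post_portmapping issued its POST. A minimal sketch of a reachability probe, not part of the test suite (the helper name controller_reachable is hypothetical; the endpoint path, admin/admin credentials and (10, 10) timeout are taken from the captured request):

import requests

def controller_reachable(base_url="http://localhost:8181", timeout=(10, 10)):
    # Any HTTP status (even 401 or 404) proves the socket accepts connections;
    # a ConnectionError means nothing is listening, which is what the log shows.
    try:
        requests.get(f"{base_url}/rests/data/transportpce-portmapping:network",
                     auth=("admin", "admin"), timeout=timeout)
        return True
    except requests.exceptions.ConnectionError:
        return False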
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' body = '{"ietf-network:network": [{"network-id": "openroadm-topology", "network-types": {"org-openroadm-common-network:openro...ber-type": "smf", "pmd": "0.50", "SRLG-length": "100000.0"}], "auto-spanloss": true, "spanloss-current": "12.0"}}}]}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '26810', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. 
note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_02_load_simple_topology_bi(self): > response = test_utils.put_ietf_network('openroadm-topology', self.simple_topo_bi_dir_data) transportpce_tests/pce/test01_pce.py:117: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:511: in put_ietf_network response = put_request(url[RESTCONF_VERSION].format('{}', network), json_payload) transportpce_tests/common/test_utils.py:124: in put_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________________ TransportPCEtesting.test_03_get_nodeId ____________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
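For reference, a rough sketch of the PUT that test_utils.put_ietf_network appears to issue, reconstructed from the captured urlopen locals above (the helper name put_ietf_network_sketch is hypothetical; method, URL, headers, credentials and timeouts come from the log). It fails here for the same reason as test_01: the controller is not listening on port 8181.

import requests

def put_ietf_network_sketch(network_id, payload,
                            base_url="http://localhost:8181"):
    # PUT the topology document to the RESTCONF data resource for this network.
    url = f"{base_url}/rests/data/ietf-network:networks/network={network_id}"
    return requests.request("PUT", url, json=payload,
                            headers={"Content-Type": "application/json",
                                     "Accept": "application/json"},
                            auth=("admin", "admin"), timeout=(10, 10))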
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. 
That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_03_get_nodeId(self): > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADMA01-SRG1', 'config') transportpce_tests/pce/test01_pce.py:123: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:580: in get_ietf_network_node_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None 
proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=ROADMA01-SRG1?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________________ TransportPCEtesting.test_04_get_linkId ____________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
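Why a single refused connection is immediately fatal in these tests: the captured locals show the adapter running with Retry(total=0, connect=None, read=False, redirect=None, status=None), so the first call to increment() spends the whole budget and urllib3 raises MaxRetryError straight away. A small sketch of that behaviour (illustration only):

from urllib3.util.retry import Retry

retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)
print(retry.is_exhausted())   # False: the budget of 0 is not yet overdrawn
# On a connection error, retry.increment(...) computes total = -1, finds the new
# Retry exhausted, and raises MaxRetryError itself, which is the retry.py:519
# frame that appears in every traceback above.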
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/i...etwork-topology:link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? 
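[editor's note] Retry.increment, whose body starts above, is where the exhausted Retry(total=0, ...) turns the connection failure into a MaxRetryError. A sketch of that path under the same conditions (a NewConnectionError with total=0); the placeholder URL is mine, not from the log:

    from urllib3.exceptions import MaxRetryError, NewConnectionError
    from urllib3.util.retry import Retry

    retries = Retry(total=0, read=False)
    error = NewConnectionError(None, "Failed to establish a new connection")

    try:
        # total goes 0 -> -1, the new Retry is exhausted, and MaxRetryError is
        # raised with the original error attached as .reason.
        retries = retries.increment(method="GET", url="/rests/data/example",
                                    error=error)
    except MaxRetryError as exc:
        print(exc.reason)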
if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_04_get_linkId(self): > response = test_utils.get_ietf_network_link_request( 'openroadm-topology', 'XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX', 'config') transportpce_tests/pce/test01_pce.py:130: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:531: in get_ietf_network_link_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
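[editor's note] For reference, the request test_04_get_linkId is attempting is a plain RESTCONF GET. A hedged equivalent with requests (URL copied from the MaxRetryError above; requests.get stands in for the project's get_request helper in test_utils.py). With the controller up this returns the link's config; here it fails exactly as in the traceback because nothing is listening on localhost:8181.

    import requests

    url = ("http://localhost:8181/rests/data/ietf-network:networks/"
           "network=openroadm-topology/ietf-network-topology:"
           "link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX"
           "?content=config")
    response = requests.get(url, auth=("admin", "admin"),
                            headers={"Accept": "application/json"},
                            timeout=(10, 10))
    print(response.status_code)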
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPDRA01-XPDR1-XPDR1-NETWORK1toROADMA01-SRG1-SRG1-PP1-TXRX?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError _____________ TransportPCEtesting.test_05_path_computation_xpdr_bi _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
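[editor's note] The except MaxRetryError block in the adapter (shown in the frames above) is the translation layer: urllib3's MaxRetryError, whose reason is the NewConnectionError, becomes requests.exceptions.ConnectionError, which is what the test ultimately reports. A small sketch of what the caller sees, assuming nothing is listening on 8181:

    import requests

    try:
        requests.get("http://localhost:8181/rests/data/ietf-network:networks",
                     auth=("admin", "admin"), timeout=(10, 10))
    except requests.exceptions.ConnectionError as exc:
        # exc.args[0] is the underlying urllib3 MaxRetryError; its reason is
        # the NewConnectionError carrying "[Errno 111] Connection refused".
        print(type(exc.args[0]).__name__, exc)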
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"service-name": "service-1", "resource-reserve": "true", "service-handler-header": {"request-id": "request...ate": "100", "clli": "NodeC", "service-format": "Ethernet", "node-id": "XPDRC01"}, "pce-routing-metric": "hop-count"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '379', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
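[editor's note] The body and headers dumped in the request context above show what the PCE RPC call looks like on the wire: a RESTCONF operations POST with JSON input and basic auth. A hedged equivalent using requests directly; the trimmed payload is illustrative, the real input built by the test also carries service-a-end/service-z-end and service-handler-header fields:

    import requests

    url = ("http://localhost:8181/rests/operations/"
           "transportpce-pce:path-computation-request")
    payload = {"input": {"service-name": "service-1",
                         "resource-reserve": "true",
                         "pce-routing-metric": "hop-count"}}
    response = requests.post(url, json=payload, auth=("admin", "admin"),
                             headers={"Content-Type": "application/json",
                                      "Accept": "application/json"},
                             timeout=(10, 10))
    print(response.status_code)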
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
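[editor's note] The Timeout(connect=10, read=10, total=None) values repeated through these frames come from the (connect, read) tuple the test utilities pass to requests, which the adapter converts as shown above. A minimal sketch of that conversion on the urllib3 side:

    from urllib3.util.timeout import Timeout

    connect, read = (10, 10)            # the tuple passed by the tests
    timeout = Timeout(connect=connect, read=read)
    print(timeout)                      # e.g. Timeout(connect=10, read=10, total=None)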
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_05_path_computation_xpdr_bi(self): > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:138: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. 
:param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError _____________ TransportPCEtesting.test_06_path_computation_rdm_bi ______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
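[editor's note] Underneath all of these frames the failure is purely at the socket layer: create_connection (above) walks the getaddrinfo results for ('localhost', 8181) and every connect() is refused. A stripped-down reproduction of that loop:

    import socket

    host, port = "localhost", 8181
    for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
        sock = socket.socket(family, socktype, proto)
        sock.settimeout(10)
        try:
            sock.connect(sockaddr)
            print("connected via", sockaddr)
            break
        except OSError as err:
            # With no controller bound to 8181 this prints
            # "[Errno 111] Connection refused" for each resolved address.
            print("failed via", sockaddr, "->", err)
        finally:
            sock.close()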
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"service-name": "service-1", "resource-reserve": "true", "service-handler-header": {"request-id": "request...te": "100", "clli": "NodeC", "service-format": "Ethernet", "node-id": "ROADMC01"}, "pce-routing-metric": "hop-count"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '381', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. 
note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
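[editor's note] The docstring above describes the low-level entry point the adapter is calling. A sketch of driving HTTPConnectionPool.urlopen directly with the same options the adapter passes in these tracebacks (redirect=False, preload_content=False, a zero-total Retry); this is illustrative only, not how the test suite issues requests:

    from urllib3 import HTTPConnectionPool
    from urllib3.util.retry import Retry

    pool = HTTPConnectionPool("localhost", 8181, timeout=10)
    try:
        resp = pool.urlopen(
            "GET",
            "/rests/data/ietf-network:networks",
            redirect=False,
            retries=Retry(total=0, read=False),
            preload_content=False,
        )
        print(resp.status)
        resp.release_conn()   # needed because preload_content=False
    except Exception as exc:  # MaxRetryError when nothing listens on 8181
        print(type(exc).__name__, exc)
    finally:
        pool.close()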
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_06_path_computation_rdm_bi(self): self.path_computation_input_data["service-a-end"]["node-id"] = "ROADMA01" self.path_computation_input_data["service-z-end"]["node-id"] = "ROADMC01" > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:150: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError _____________ TransportPCEtesting.test_07_load_simple_topology_uni _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
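[editor's note] Every failure in this run has the same root cause: nothing is serving RESTCONF on localhost:8181 when the PCE suite starts. A hypothetical readiness poll of the kind that could gate the suite; this helper is an assumption, not part of transportpce_tests/common/test_utils.py:

    import time
    import requests

    def wait_for_restconf(base="http://localhost:8181", attempts=30, delay=2.0):
        """Return True once RESTCONF answers on the base URL, else False."""
        for _ in range(attempts):
            try:
                requests.get(f"{base}/rests/data/ietf-network:networks",
                             auth=("admin", "admin"), timeout=(2, 2))
                return True
            except requests.exceptions.ConnectionError:
                time.sleep(delay)
        return False

    if __name__ == "__main__":
        print("controller reachable:", wait_for_restconf())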
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' body = '{"ietf-network:network": [{"network-id": "openroadm-topology", "ietf-network-topology:link": [{"link-id": "OpenROADM-...-common-network:operational-state": "inService", "org-openroadm-common-network:administrative-state": "inService"}]}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...keep-alive', 'Content-Type': 'application/json', 'Content-Length': '266452', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) 
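[editor's note] test_07_load_simple_topology_uni fails on the PUT shown in the request context above (a roughly 266 kB openroadm-topology document, per the Content-Length header). A hedged, heavily trimmed equivalent; the one-entry body here is a placeholder, not the real topology file the test loads:

    import requests

    url = ("http://localhost:8181/rests/data/"
           "ietf-network:networks/network=openroadm-topology")
    body = {"ietf-network:network": [{"network-id": "openroadm-topology"}]}
    response = requests.put(url, json=body, auth=("admin", "admin"),
                            headers={"Content-Type": "application/json",
                                     "Accept": "application/json"},
                            timeout=(10, 10))
    print(response.status_code)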
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_07_load_simple_topology_uni(self): > response = test_utils.put_ietf_network('openroadm-topology', self.simple_topo_uni_dir_data) transportpce_tests/pce/test01_pce.py:160: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:511: in put_ietf_network response = put_request(url[RESTCONF_VERSION].format('{}', network), json_payload) transportpce_tests/common/test_utils.py:124: in put_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
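test_07_load_simple_topology_uni fails inside test_utils.put_ietf_network(), which goes through put_request() (transportpce_tests/common/test_utils.py:511 and :124 in the traceback) down to requests.request(). The helper's real signature is not visible in this log; the sketch below is a simplified, hypothetical equivalent of that PUT, built only from values shown in the captured request (RESTCONF path, JSON content type, Basic admin:admin credentials, 10-second connect/read timeouts).

import requests

def put_openroadm_topology(payload: dict) -> requests.Response:
    # Hypothetical stand-in for test_utils.put_request(); only the final
    # requests.request() call is actually visible in the traceback above.
    return requests.request(
        "PUT",
        "http://localhost:8181/rests/data/ietf-network:networks/network=openroadm-topology",
        json=payload,
        headers={"Content-Type": "application/json"},
        auth=("admin", "admin"),  # the Authorization header above decodes to admin:admin
        timeout=(10, 10),         # matches Timeout(connect=10, read=10) in the traceback
    )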
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________________ TransportPCEtesting.test_08_get_nodeId ____________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-1-2?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-1-2', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. 
That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-1-2?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
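The Retry(total=0, connect=None, read=False, redirect=None, status=None) object appearing throughout these tracebacks is requests' default adapter policy (max_retries=0): with total=0 the first connection error exhausts the budget, increment() raises MaxRetryError, and the adapter rewraps it as requests.exceptions.ConnectionError. A short sketch of configuring such a policy explicitly on a Session, using only standard requests/urllib3 API:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# total=0 means no retries: the first refused connection surfaces immediately.
session.mount("http://", HTTPAdapter(max_retries=Retry(total=0, read=False)))

Raising total (or connect) would only make the client retry the same refused port; it cannot compensate for the controller not listening on 8181.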
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-1-2?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_08_get_nodeId(self): > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPONDER-1-2', 'config') transportpce_tests/pce/test01_pce.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:580: in get_ietf_network_node_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = 
OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-1-2?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________________ TransportPCEtesting.test_09_get_linkId ____________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
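As the urlopen() docstring above notes, retries=False disables the Retry machinery and re-raises the underlying exception immediately. A minimal sketch of the same GET issued directly through urllib3 with retries disabled, using the link URL from this traceback (PoolManager.request is standard urllib3 API; authentication headers are omitted for brevity):

import urllib3

http = urllib3.PoolManager()
# With retries=False a refused connection raises NewConnectionError directly,
# without the MaxRetryError wrapping seen in the tracebacks above.
resp = http.request(
    "GET",
    "http://localhost:8181/rests/data/ietf-network:networks/network=openroadm-topology"
    "/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX"
    "?content=config",
    retries=False,
    timeout=urllib3.Timeout(connect=10, read=10),
)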
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? 
if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_09_get_linkId(self): > response = test_utils.get_ietf_network_link_request( 'openroadm-topology', 'XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX', 'config') transportpce_tests/pce/test01_pce.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:531: in get_ietf_network_link_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
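The timeout parameter described above is what produces the Timeout(connect=10, read=10, total=None) objects seen throughout this log: the requests adapter converts a bare float or a (connect, read) tuple into urllib3's Timeout, as the conversion code earlier in send() shows. A brief usage sketch against the endpoint from this log (standard requests API; either call naturally requires the controller to be listening):

import requests

URL = "http://localhost:8181/rests/data/ietf-network:networks/network=openroadm-topology"

# A single float applies to both the connect and the read timeout.
requests.get(URL, auth=("admin", "admin"), timeout=10)

# A (connect, read) tuple maps to Timeout(connect=10, read=10) as in the traceback.
requests.get(URL, auth=("admin", "admin"), timeout=(10, 10))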
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________ TransportPCEtesting.test_10_path_computation_xpdr_uni _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
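Every failure in this run follows the same pattern: nothing is listening on localhost:8181, so urllib3's MaxRetryError (wrapping a NewConnectionError) is re-raised by requests as requests.exceptions.ConnectionError. Below is a minimal sketch of the GET that test_09 drives through the test_utils helpers; the URL and the admin/admin credentials are taken from this log, but the exact helper wiring in transportpce_tests/common/test_utils.py may differ.

import requests

URL = ("http://localhost:8181/rests/data/ietf-network:networks/"
       "network=openroadm-topology/ietf-network-topology:"
       "link=XPONDER-1-2XPDR-NW1-TX-toOpenROADM-1-2-SRG1-SRG1-PP1-RX"
       "?content=config")

try:
    # timeout=(10, 10) matches the Timeout(connect=10, read=10) seen in the traceback
    response = requests.get(URL, auth=("admin", "admin"),
                            headers={"Accept": "application/json"},
                            timeout=(10, 10))
    print(response.status_code)
except requests.exceptions.ConnectionError as exc:
    # the branch hit in this log: the controller is simply not reachable
    print(f"controller not reachable on localhost:8181: {exc}")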
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"service-name": "service-1", "resource-reserve": "true", "service-handler-header": {"request-id": "request..."100", "clli": "ORANGE3", "service-format": "Ethernet", "node-id": "XPONDER-3-2"}, "pce-routing-metric": "hop-count"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '391', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_10_path_computation_xpdr_uni(self): self.path_computation_input_data["service-a-end"]["node-id"] = "XPONDER-1-2" self.path_computation_input_data["service-a-end"]["clli"] = "ORANGE1" self.path_computation_input_data["service-z-end"]["node-id"] = "XPONDER-3-2" self.path_computation_input_data["service-z-end"]["clli"] = "ORANGE3" > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError _____________ TransportPCEtesting.test_11_path_computation_rdm_uni _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. 
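The adapter frames repeated above show requests normalising its timeout argument into urllib3's Timeout(connect=10, read=10, total=None). The sketch below is illustrative only and shows the two equivalent spellings; the endpoint used is just an example target.

import requests
from urllib3.util import Timeout

# requests side: a (connect, read) tuple, or a single float applied to both
requests_timeout = (10, 10)

# urllib3 side: the equivalent explicit Timeout object
urllib3_timeout = Timeout(connect=10, read=10)

try:
    requests.get("http://localhost:8181/rests/data/ietf-network:networks",
                 auth=("admin", "admin"), timeout=requests_timeout)
except requests.exceptions.ConnectionError:
    pass  # expected while the controller is down, as in this run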
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"service-name": "service-1", "resource-reserve": "true", "service-handler-header": {"request-id": "request...100", "clli": "ncli22", "service-format": "Ethernet", "node-id": "OpenROADM-2-2"}, "pce-routing-metric": "hop-count"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '392', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. 
note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
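The retries parameter documented just above is what produces the Retry(total=0, connect=None, read=False, redirect=None, status=None) object seen in every traceback: requests' HTTPAdapter defaults to max_retries=0, so the first refused connection already exhausts the budget. A sketch, assuming connect retries are acceptable for the caller (the test suite itself keeps the default), of mounting a more tolerant policy:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# retry refused/failed TCP connects a few times with exponential backoff
retry = Retry(connect=5, backoff_factor=0.5)
session.mount("http://", HTTPAdapter(max_retries=retry))

try:
    session.get("http://localhost:8181/rests/data/ietf-network:networks",
                auth=("admin", "admin"), timeout=(10, 10))
except requests.exceptions.ConnectionError:
    print("still refused after the configured connect retries")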
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_11_path_computation_rdm_uni(self): self.path_computation_input_data["service-a-end"]["node-id"] = "OpenROADM-2-1" self.path_computation_input_data["service-a-end"]["clli"] = "cll21" self.path_computation_input_data["service-z-end"]["node-id"] = "OpenROADM-2-2" self.path_computation_input_data["service-z-end"]["clli"] = "ncli22" > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. 
:param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ______________ TransportPCEtesting.test_12_load_complex_topology _______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
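test_10 and test_11 both fail while POSTing the path-computation-request RPC through test_utils.transportpce_api_rpc_request. A standalone sketch of that call follows; the input body is trimmed to the fields visible in this log, while the real payload built by the suite carries more (request-id, service-format, and so on).

import requests

RPC_URL = ("http://localhost:8181/rests/operations/"
           "transportpce-pce:path-computation-request")

payload = {
    "input": {
        "service-name": "service-1",
        "resource-reserve": "true",
        "pce-routing-metric": "hop-count",
        "service-a-end": {"node-id": "XPONDER-1-2", "clli": "ORANGE1"},
        "service-z-end": {"node-id": "XPONDER-3-2", "clli": "ORANGE3"},
    }
}

response = requests.post(
    RPC_URL,
    json=payload,                      # sets Content-Type: application/json
    headers={"Accept": "application/json"},
    auth=("admin", "admin"),
    timeout=(10, 10),
)
print(response.status_code, response.text)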
If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' body = '{"ietf-network:network": [{"network-id": "openroadm-topology", "ietf-network-topology:link": [{"link-id": "OpenROADM-...-common-network:operational-state": "inService", "org-openroadm-common-network:administrative-state": "inService"}]}]}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...keep-alive', 'Content-Type': 'application/json', 'Content-Length': '466906', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. 
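test_12 fails in the same way while PUTting the complex topology, a roughly 466 kB ietf-network document judging by the Content-Length above. The sketch below uses only a stub body, since the full topology file is not reproduced in this log.

import requests

TOPO_URL = ("http://localhost:8181/rests/data/"
            "ietf-network:networks/network=openroadm-topology")

topology_stub = {
    "ietf-network:network": [
        {
            "network-id": "openroadm-topology",
            # the test suite fills in the full ietf-network-topology:link list here
            "ietf-network-topology:link": [],
        }
    ]
}

response = requests.put(
    TOPO_URL,
    json=topology_stub,
    headers={"Accept": "application/json"},
    auth=("admin", "admin"),
    timeout=(10, 10),
)
print(response.status_code)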
note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
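A short sketch of how these send() parameters are usually supplied from user code, one layer up at the requests API (URL and CA bundle path are illustrative assumptions, not taken from the tests):

    # Hedged sketch: the (connect, read) tuple below is what HTTPAdapter.send()
    # later unpacks into a connect/read timeout pair; verify may be a boolean
    # or a path to a CA bundle.
    import requests

    response = requests.get(
        "https://example.com/api/resource",
        timeout=(10, 10),                                # (connect timeout, read timeout)
        verify="/etc/ssl/certs/ca-certificates.crt",     # or simply True / False
        stream=False,                                    # read the body eagerly
    )
    print(response.status_code)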
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'PUT' url = '/rests/data/ietf-network:networks/network=openroadm-topology' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_12_load_complex_topology(self): > response = test_utils.put_ietf_network('openroadm-topology', self.complex_topo_uni_dir_data) transportpce_tests/pce/test01_pce.py:221: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:511: in put_ietf_network response = put_request(url[RESTCONF_VERSION].format('{}', network), json_payload) transportpce_tests/common/test_utils.py:124: in put_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
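The MaxRetryError above comes from Retry.increment() exhausting a Retry(total=0) budget on the very first connection error. A standalone sketch of that mechanism, mirroring the localhost:8181 endpoint seen in the trace but with a hypothetical resource path and assuming nothing is listening on the port:

    # Hedged sketch: Retry(total=0) is exhausted immediately on a connection error,
    # so urlopen() raises MaxRetryError wrapping the underlying NewConnectionError.
    import urllib3
    from urllib3.exceptions import MaxRetryError
    from urllib3.util.retry import Retry

    pool = urllib3.HTTPConnectionPool("localhost", port=8181)
    try:
        pool.urlopen("PUT", "/rests/data/example", body=b"{}", retries=Retry(total=0))
    except MaxRetryError as exc:
        # exc.reason is the NewConnectionError raised when the TCP connect is refused
        print(type(exc.reason).__name__, exc.reason)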
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________________ TransportPCEtesting.test_13_get_nodeId ____________________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-3-2?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-3-2', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. 
That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-3-2?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. 
:type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-3-2?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_13_get_nodeId(self): > response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPONDER-3-2', 'config') transportpce_tests/pce/test01_pce.py:227: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:580: in get_ietf_network_node_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) transportpce_tests/common/test_utils.py:116: in get_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = 
OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/node=XPONDER-3-2?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ______________ TransportPCEtesting.test_14_fail_path_computation _______________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
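At the lowest layer, the ConnectionRefusedError chained into these failures comes straight from the OS socket connect. A minimal standalone sketch of that failure mode, again assuming port 8181 has no listener:

    # Hedged sketch: connecting to a port with no listener raises
    # ConnectionRefusedError (errno 111 on Linux), which urllib3 then wraps
    # into NewConnectionError as shown in the traces above.
    import errno
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(10)
    try:
        sock.connect(("localhost", 8181))   # assumes nothing is listening on 8181
    except ConnectionRefusedError as exc:
        assert exc.errno == errno.ECONNREFUSED
        print("connection refused:", exc)
    finally:
        sock.close()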
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "hop-count"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '128', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. 
clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_14_fail_path_computation(self): del self.path_computation_input_data["service-name"] del self.path_computation_input_data["service-a-end"] del self.path_computation_input_data["service-z-end"] > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:237: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________ TransportPCEtesting.test_15_success1_path_computation _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "h..."Some customer-code"], "co-routing": {"service-identifier-list": [{"service-identifier": "Some existing-service"}]}}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '705', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. 
note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
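Note: the retries documentation quoted above is relevant to why these failures surface immediately: requests' adapter passes its own max_retries down as Retry(total=0, read=False), so a refused connection is never retried. A hedged sketch (not part of the TransportPCE test utilities) of opting into connect retries at the requests level, which can help when the controller is merely slow to start listening on 8181:

    # Hedged sketch: retry refused connections with backoff instead of failing
    # on the first attempt. Session/adapter wiring here is illustrative only.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    retry = Retry(total=5, connect=5, backoff_factor=0.5)   # retry connect errors, exponential backoff
    session = requests.Session()
    session.mount("http://localhost:8181/", HTTPAdapter(max_retries=retry))
    # session.post(...) now retries connection errors before raising ConnectionError.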
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_15_success1_path_computation(self): self.path_computation_input_data["service-name"] = "service 1" self.path_computation_input_data["service-a-end"] = {"service-format": "Ethernet", "service-rate": "100", "clli": "ORANGE2", "node-id": "XPONDER-2-2"} self.path_computation_input_data["service-z-end"] = {"service-format": "Ethernet", "service-rate": "100", "clli": "ORANGE1", "node-id": "XPONDER-1-2"} self.path_computation_input_data["hard-constraints"] = {"customer-code": ["Some customer-code"], "co-routing": { "service-identifier-list": [{ "service-identifier": "Some existing-service"}] }} self.path_computation_input_data["soft-constraints"] = {"customer-code": ["Some customer-code"], "co-routing": { "service-identifier-list": [{ "service-identifier": "Some existing-service"}] }} > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:262: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
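Note: test_15's input, visible in the test body above, is a plain dict posted to the path-computation-request RPC. A hedged, standalone reproduction with requests (endpoint and credentials taken from the captured request; the payload is trimmed to the fields shown, and this is not the repo's test_utils helper):

    # Standalone reproduction of the failing RPC call from test_15 (illustrative).
    import requests

    payload = {"input": {
        "service-name": "service 1",
        "service-handler-header": {"request-id": "request1"},
        "service-a-end": {"service-format": "Ethernet", "service-rate": "100",
                          "clli": "ORANGE2", "node-id": "XPONDER-2-2"},
        "service-z-end": {"service-format": "Ethernet", "service-rate": "100",
                          "clli": "ORANGE1", "node-id": "XPONDER-1-2"},
    }}
    try:
        response = requests.post(
            "http://localhost:8181/rests/operations/transportpce-pce:path-computation-request",
            json=payload, auth=("admin", "admin"), timeout=(10, 10))
        print(response.status_code, response.json())
    except requests.exceptions.ConnectionError as exc:
        # This is the branch every test in this run lands in while nothing listens on 8181.
        print(f"controller unreachable: {exc}")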
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________ TransportPCEtesting.test_16_success2_path_computation _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "h... 
"service-z-end": {"service-format": "Ethernet", "service-rate": "100", "clli": "ORANGE3", "node-id": "XPONDER-3-2"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '391', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? 
if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_16_success2_path_computation(self): self.path_computation_input_data["service-a-end"]["node-id"] = "XPONDER-1-2" self.path_computation_input_data["service-a-end"]["clli"] = "ORANGE1" self.path_computation_input_data["service-z-end"]["node-id"] = "XPONDER-3-2" self.path_computation_input_data["service-z-end"]["clli"] = "ORANGE3" del self.path_computation_input_data["hard-constraints"] del self.path_computation_input_data["soft-constraints"] > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:279: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
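Note: as the timeout documentation above says, requests accepts either a float or a (connect, read) tuple, and send() converts either form into a urllib3 Timeout. The Timeout(connect=10, read=10, total=None) seen throughout these tracebacks is what a timeout=(10, 10) (or timeout=10) argument becomes; a small sketch of that conversion, assuming nothing beyond the quoted adapter code:

    # Hedged sketch of the timeout conversion performed in send() above.
    from urllib3.util import Timeout

    timeout = (10, 10)                          # what a caller passes to requests
    connect, read = timeout
    print(Timeout(connect=connect, read=read))  # Timeout(connect=10, read=10, total=None)
    print(Timeout(connect=10, read=10))         # identical result for a plain timeout=10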
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError ____________ TransportPCEtesting.test_17_success3_path_computation _____________ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "h...RANGE3", "node-id": "XPONDER-3-2"}, "hard-constraints": {"exclude": {"node-id": ["OpenROADM-2-1", "OpenROADM-2-2"]}}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '473', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. 
note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
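Note: the preload_content/release_conn parameters described above are why requests always calls urlopen with preload_content=False and decode_content=False (visible in send() earlier) and wraps the raw urllib3 response. A short sketch of that streaming pattern used directly with urllib3; the endpoint is illustrative only:

    # Sketch of the preload_content/release_conn pattern described above.
    import urllib3

    http = urllib3.PoolManager()
    r = http.request("GET", "http://localhost:8181/rests/data", preload_content=False)
    try:
        chunk = r.read(1024)      # body is read on demand instead of eagerly
    finally:
        r.release_conn()          # hand the connection back to the pool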
""" parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. 
response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_17_success3_path_computation(self): self.path_computation_input_data["hard-constraints"] = {"exclude": {"node-id": ["OpenROADM-2-1", "OpenROADM-2-2"]}} > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:295: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
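The request body and test code dumped above make the shape of the RPC input clear. The dict below is a sketch reconstructed only from the fields visible in this log (the truncated "pce-routing-metric": "h..." value and the service-a-end block for test_17 are left out rather than guessed); it is not the full fixture from transportpce_tests/pce/test01_pce.py.

path_computation_input_data = {
    "resource-reserve": "true",
    "service-handler-header": {"request-id": "request1"},
    "service-z-end": {
        "service-format": "Ethernet",
        "service-rate": "100",
        "clli": "ORANGE1",
        "node-id": "XPONDER-1-2",
    },
    # test_17 adds this exclusion constraint just before posting:
    "hard-constraints": {"exclude": {"node-id": ["OpenROADM-2-1", "OpenROADM-2-2"]}},
}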
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError __ TransportPCEtesting.test_18_path_computation_before_oms_attribute_deletion __ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "h... "service-z-end": {"service-format": "Ethernet", "service-rate": "100", "clli": "ORANGE1", "node-id": "XPONDER-1-2"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '391', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) 
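Every frame above bottoms out in the same place: socket.create_connection(('localhost', 8181)) is refused, meaning no controller is listening on the RESTCONF port while the PCE suite runs. That can be confirmed outside the test harness with a plain socket probe; the helper name and the 2-second timeout below are illustrative, not part of the project's tooling.

import socket

def controller_is_listening(host: str = "localhost", port: int = 8181,
                            timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        # Same primitive urllib3 calls above; ECONNREFUSED and connect
        # timeouts both surface as OSError subclasses.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("controller reachable:", controller_is_listening())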
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. 
Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
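The retries value echoed above, Retry(total=0, connect=None, read=False, redirect=None, status=None), is what requests' default HTTPAdapter passes down: a single attempt, no retries, so the very first refused connection is fatal. If retrying refused connections were desired, a session-level policy could be mounted like this (the counts and back-off factor are illustrative, not the project's configuration):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

retry_policy = Retry(total=3, connect=3, backoff_factor=0.5)
session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=retry_policy))

# A refused connection to http://localhost:8181/... would now be retried
# up to three times with exponential back-off before ConnectionError is raised.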
""" try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_18_path_computation_before_oms_attribute_deletion(self): self.path_computation_input_data["service-a-end"]["node-id"] = "XPONDER-2-2" self.path_computation_input_data["service-a-end"]["clli"] = "ORANGE2" self.path_computation_input_data["service-z-end"]["node-id"] = "XPONDER-1-2" self.path_computation_input_data["service-z-end"]["clli"] = "ORANGE1" del self.path_computation_input_data["hard-constraints"] > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:314: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError _ TransportPCEtesting.test_19_delete_oms_attribute_in_openroadm13toopenroadm12_link _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
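The failing calls all funnel through transportpce_tests/common/test_utils.py, which posts the RPC input to the RESTCONF operations endpoint shown in the traceback. The function below is a standalone sketch of that kind of call, not the project's helper; the URL, JSON shape, admin/admin basic-auth credentials (Basic YWRtaW46YWRtaW4=) and 10-second timeouts are the values visible in the request dump above.

import requests

def path_computation_request(payload: dict,
                             base_url: str = "http://localhost:8181") -> requests.Response:
    """POST a transportpce-pce path-computation-request RPC."""
    url = f"{base_url}/rests/operations/transportpce-pce:path-computation-request"
    return requests.post(
        url,
        json={"input": payload},
        headers={"Accept": "application/json"},
        auth=("admin", "admin"),
        timeout=(10, 10),  # (connect, read), as in Timeout(connect=10, read=10)
    )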
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'DELETE' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2/org-openroadm-network-topology:OMS-attributes/span' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology/i...penROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2/org-openroadm-network-topology:OMS-attributes/span', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. 
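test_19 goes through the same stack but with a RESTCONF DELETE on the OMS-attributes span of one openroadm-topology link; the URL is the one shown in the urlopen() parameters above. The wrapper below is an illustrative sketch, not test_utils.del_oms_attr_request itself, and a live controller would typically answer such a DELETE with 204 No Content.

import requests

def delete_oms_span(link_id: str,
                    base_url: str = "http://localhost:8181") -> requests.Response:
    url = (f"{base_url}/rests/data/ietf-network:networks/"
           f"network=openroadm-topology/ietf-network-topology:link={link_id}/"
           f"org-openroadm-network-topology:OMS-attributes/span")
    return requests.delete(url, auth=("admin", "admin"), timeout=(10, 10))

# delete_oms_span("OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2") raises
# requests.exceptions.ConnectionError in this run because port 8181 is closed.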
err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'DELETE' url = '/rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2/org-openroadm-network-topology:OMS-attributes/span' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? 
if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_19_delete_oms_attribute_in_openroadm13toopenroadm12_link(self): > response = test_utils.del_oms_attr_request("OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2") transportpce_tests/pce/test01_pce.py:335: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:567: in del_oms_attr_request response = delete_request(url2.format('{}', network, link)) transportpce_tests/common/test_utils.py:133: in delete_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology/ietf-network-topology:link=OpenROADM-1-3-DEG2-to-OpenROADM-1-2-DEG2/org-openroadm-network-topology:OMS-attributes/span (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError __ TransportPCEtesting.test_20_path_computation_after_oms_attribute_deletion ___ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
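As the adapter code above shows, urllib3's MaxRetryError (caused here by NewConnectionError / ECONNREFUSED) is re-raised as requests.exceptions.ConnectionError, so callers only need to handle the requests-level exception. A defensive pattern along these lines would make the root cause explicit instead of letting every test fail with the same traceback; the wrapper is illustrative, not project code.

import requests

def rpc_or_none(url: str, payload: dict):
    """Return the Response, or None if the controller is unreachable."""
    try:
        return requests.post(url, json={"input": payload},
                             auth=("admin", "admin"), timeout=(10, 10))
    except requests.exceptions.ConnectionError as exc:
        # str(exc) still contains the underlying MaxRetryError detail.
        print(f"controller unreachable: {exc}")
        return None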
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8181), timeout = 10, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. _set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' body = '{"input": {"resource-reserve": "true", "service-handler-header": {"request-id": "request1"}, "pce-routing-metric": "h... 
"service-z-end": {"service-format": "Ethernet", "service-rate": "100", "clli": "ORANGE1", "node-id": "XPONDER-1-2"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '391', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=10, read=10, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? 
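The same request can be expressed directly against urllib3, which makes the parameters echoed above (one attempt, 10-second connect and read timeouts) easier to see in isolation. The snippet mirrors those values; it is a sketch, not what the requests adapter literally executes.

import urllib3

http = urllib3.PoolManager()
try:
    resp = http.request(
        "POST",
        "http://localhost:8181/rests/operations/transportpce-pce:path-computation-request",
        body=b'{"input": {}}',  # placeholder body, the real payload is larger
        headers={"Content-Type": "application/json"},
        timeout=urllib3.Timeout(connect=10, read=10),
        retries=0,  # equivalent to Retry(total=0): fail on the first error
    )
    print(resp.status)
except urllib3.exceptions.MaxRetryError as exc:
    # With nothing listening on 8181 this is exactly the error in the log.
    print("gave up:", exc.reason)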
if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:495: in _make_request conn.request( ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:398: in request self.endheaders() /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1289: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:1048: in _send_output self.send(msg) /opt/pyenv/versions/3.11.7/lib/python3.11/http/client.py:986: in send self.connect() ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:236: in connect self.sock = self._new_conn() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:211: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:843: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST' url = '/rests/operations/transportpce-pce:path-computation-request' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: self = def test_20_path_computation_after_oms_attribute_deletion(self): > response = test_utils.transportpce_api_rpc_request('transportpce-pce', 'path-computation-request', self.path_computation_input_data) transportpce_tests/pce/test01_pce.py:341: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils.py:684: in transportpce_api_rpc_request response = post_request(url, data) transportpce_tests/common/test_utils.py:142: in post_request return requests.request( ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=10, read=10, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
:rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:700: ConnectionError =========================== short test summary info ============================ FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_01_load_port_mapping FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_02_load_simple_topology_bi FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_03_get_nodeId FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_04_get_linkId FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_05_path_computation_xpdr_bi FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_06_path_computation_rdm_bi FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_07_load_simple_topology_uni FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_08_get_nodeId FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_09_get_linkId FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_10_path_computation_xpdr_uni FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_11_path_computation_rdm_uni FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_12_load_complex_topology FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_13_get_nodeId FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_14_fail_path_computation FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_15_success1_path_computation 
FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_16_success2_path_computation
FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_17_success3_path_computation
FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_18_path_computation_before_oms_attribute_deletion
FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_19_delete_oms_attribute_in_openroadm13toopenroadm12_link
FAILED transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_20_path_computation_after_oms_attribute_deletion
ERROR transportpce_tests/pce/test01_pce.py::TransportPCEtesting::test_20_path_computation_after_oms_attribute_deletion
20 failed, 1 error in 72.53s (0:01:12)
build_karaf_tests_hybrid: OK ✔ in 1 minute 4.7 seconds
testsPCE: exit 1 (72.87 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pid=32177
testsPCE: FAIL ✖ in 3 minutes 13.88 seconds
tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
.....tests121: freeze> python -m pip freeze --all
tests121: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1
using environment variables from ./karaf121.env
pytest -q transportpce_tests/1.2.1/test01_portmapping.py
............................................. [100%]
50 passed in 234.34s (0:03:54)
pytest -q transportpce_tests/tapi/test02_full_topology.py
..................................... [100%]
21 passed in 265.12s (0:04:25)
pytest -q transportpce_tests/1.2.1/test02_topo_portmapping.py
.............. [100%]
30 passed in 267.48s (0:04:27)
pytest -q transportpce_tests/tapi/test03_tapi_device_change_notifications.py
................... [100%]
6 passed in 222.30s (0:03:42)
pytest -q transportpce_tests/1.2.1/test03_topology.py
......................................................... [100%]
70 passed in 264.68s (0:04:24)
tests_tapi: OK ✔ in 13 minutes 3.21 seconds
tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
tests71: freeze> python -m pip freeze --all
tests71: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1
using environment variables from ./karaf71.env
pytest -q transportpce_tests/7.1/test01_portmapping.py
............ [100%]
12 passed in 40.35s
pytest -q transportpce_tests/7.1/test02_otn_renderer.py
..........................................................................
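Editor's note, stepping out of the interleaved run output for a moment: every testsPCE failure above reduces to "[Errno 111] Connection refused" on localhost:8181, i.e. nothing was listening on the RESTCONF port when ./launch_tests.sh pce issued its first request. The snippet below is a hypothetical pre-flight probe, not part of test_utils; the host, port and wait budget are assumptions. It would fail fast with one clear message instead of twenty identical ConnectionErrors.

# Hypothetical pre-flight check, not part of the suite: wait until the
# controller's RESTCONF port accepts TCP connections before starting pytest.
import socket
import time

def wait_for_restconf(host="localhost", port=8181, budget_s=180):
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=5):
                return True
        except OSError:        # ECONNREFUSED while Karaf is still starting
            time.sleep(2)
    return False

if not wait_for_restconf():
    raise SystemExit("RESTCONF on localhost:8181 never came up; "
                     "this is what the 'Connection refused' failures above mean")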
[100%]
62 passed in 153.97s (0:02:33)
pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py
........................................................ [100%]
48 passed in 133.50s (0:02:13)
pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py
...................... [100%]
22 passed in 70.74s (0:01:10)
tests71: OK ✔ in 6 minutes 46.11 seconds
tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
tests221: freeze> python -m pip freeze --all
tests221: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.1,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==70.1.0,urllib3==2.2.2,wheel==0.43.0
tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1
using environment variables from ./karaf221.env
pytest -q transportpce_tests/2.2.1/test01_portmapping.py
................................................. [100%]
35 passed in 69.82s (0:01:09)
pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py
.......... [100%]
44 passed in 664.29s (0:11:04)
pytest -q transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py
...... [100%]
6 passed in 45.80s
pytest -q transportpce_tests/2.2.1/test03_topology.py
............................................ [100%]
44 passed in 133.64s (0:02:13)
pytest -q transportpce_tests/2.2.1/test04_otn_topology.py
............... [100%]
12 passed in 58.62s
pytest -q transportpce_tests/2.2.1/test05_flex_grid.py
........................ [100%]
24 passed in 262.82s (0:04:22)
.pytest -q transportpce_tests/1.2.1/test05_olm.py
............................ [100%]
16 passed in 114.42s (0:01:54)
pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py
..................................... [100%]
31 passed in 33.03s
pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py
...................................... [100%]
40 passed in 176.81s (0:02:56)
pytest -q transportpce_tests/1.2.1/test06_end2end.py
...... [100%]
26 passed in 89.70s (0:01:29)
pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py
................................................. [100%]
22 passed in 109.90s (0:01:49)
pytest -q transportpce_tests/2.2.1/test09_olm.py
............................................................. [100%]
40 passed in 177.39s (0:02:57)
pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py
...................................................... [100%]
54 passed in 541.80s (0:09:01)
........................ [ 74%]
.........................
[100%] 97 passed in 667.12s (0:11:07) pytest -q transportpce_tests/2.2.1/test12_end2end.py FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF [100%] =================================== FAILURES =================================== ________________ TransportPCEFulltesting.test_01_connect_xpdrA _________________ self = def test_01_connect_xpdrA(self): response = test_utils.mount_device("XPDR-A1", ('xpdra', self.NODE_VERSION)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/2.2.1/test12_end2end.py:156: AssertionError ---------------------------- Captured stdout setup ----------------------------- starting OpenDaylight... starting KARAF TransportPCE build... Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found! OpenDaylight started ! starting simulator xpdra in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in xpdra-221.log... Pattern found! simulator for xpdra started starting simulator roadma in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in roadma-221.log... Pattern found! simulator for roadma started starting simulator roadmc in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in roadmc-221.log... Pattern found! simulator for roadmc started starting simulator xpdrc in OpenROADM device version 2.2.1... Searching for pattern 'Data tree change listeners registered' in xpdrc-221.log... Pattern found! simulator for xpdrc started ----------------------------- Captured stdout call ----------------------------- execution of test_01_connect_xpdrA Searching for pattern 'Triggering notification stream NETCONF for node XPDR-A1' in karaf.log... Pattern not found after 180 seconds! Node XPDR-A1 still not added to tpce topology... ________________ TransportPCEFulltesting.test_02_connect_xpdrC _________________ self = def test_02_connect_xpdrC(self): response = test_utils.mount_device("XPDR-C1", ('xpdrc', self.NODE_VERSION)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/2.2.1/test12_end2end.py:160: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_02_connect_xpdrC Searching for pattern 'Triggering notification stream NETCONF for node XPDR-C1' in karaf.log... Pattern not found after 180 seconds! Node XPDR-C1 still not added to tpce topology... _________________ TransportPCEFulltesting.test_03_connect_rdmA _________________ self = def test_03_connect_rdmA(self): response = test_utils.mount_device("ROADM-A1", ('roadma', self.NODE_VERSION)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/2.2.1/test12_end2end.py:164: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_03_connect_rdmA Searching for pattern 'Triggering notification stream NETCONF for node ROADM-A1' in karaf.log... Pattern not found after 180 seconds! Node ROADM-A1 still not added to tpce topology... 
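Editor's note on the failures above: test12_end2end is a different situation from testsPCE. The captured setup output shows Karaf and the simulators did start, yet every RESTCONF call is answered with 401 Unauthorized, so mount_device() sees "401 != 201" and the expected karaf.log patterns never appear. A quick way to confirm this is an authentication problem rather than a connectivity one is a single authenticated GET with the same basic-auth credentials the suite sends (admin/admin, per the Authorization header earlier in the log). The endpoint below is only an illustrative choice, not a helper the suite calls.

# Sketch for triaging the 401s: reuse the credentials the tests send and
# check what a plain GET returns. 401 means AAA rejected admin/admin;
# a ConnectionError would instead point at the controller not running.
# The modules-state path is an example endpoint, not part of test_utils.
import requests

resp = requests.get(
    "http://localhost:8181/rests/data/ietf-yang-library:modules-state",
    auth=("admin", "admin"),
    timeout=10,
)
print(resp.status_code)   # 200/204: credentials accepted; 401: rejected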
_________________ TransportPCEFulltesting.test_04_connect_rdmC _________________ self = def test_04_connect_rdmC(self): response = test_utils.mount_device("ROADM-C1", ('roadmc', self.NODE_VERSION)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/2.2.1/test12_end2end.py:168: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_connect_rdmC Searching for pattern 'Triggering notification stream NETCONF for node ROADM-C1' in karaf.log... Pattern not found after 180 seconds! Node ROADM-C1 still not added to tpce topology... ________ TransportPCEFulltesting.test_05_connect_xpdrA_N1_to_roadmA_PP1 ________ self = def test_05_connect_xpdrA_N1_to_roadmA_PP1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDR-A1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:171: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_05_connect_xpdrA_N1_to_roadmA_PP1 ________ TransportPCEFulltesting.test_06_connect_roadmA_PP1_to_xpdrA_N1 ________ self = def test_06_connect_roadmA_PP1_to_xpdrA_N1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDR-A1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == 
requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_06_connect_roadmA_PP1_to_xpdrA_N1 _____ TransportPCEFulltesting.test_07_connect_xpdrC_xpdr1_N1_to_roadmC_PP1 _____ self = def test_07_connect_xpdrC_xpdr1_N1_to_roadmC_PP1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:189: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_07_connect_xpdrC_xpdr1_N1_to_roadmC_PP1 _____ TransportPCEFulltesting.test_08_connect_roadmC_PP1_to_xpdrC_xpdr1_N1 _____ self = def test_08_connect_roadmC_PP1_to_xpdrC_xpdr1_N1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code 
== requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_08_connect_roadmC_PP1_to_xpdrC_xpdr1_N1 ________ TransportPCEFulltesting.test_09_connect_xpdrA_N2_to_roadmA_PP2 ________ self = def test_09_connect_xpdrA_N2_to_roadmA_PP2(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDR-A1', 'xpdr-num': '1', 'network-num': '2', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '2', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_09_connect_xpdrA_N2_to_roadmA_PP2 ________ TransportPCEFulltesting.test_10_connect_roadmA_PP2_to_xpdrA_N2 ________ self = def test_10_connect_roadmA_PP2_to_xpdrA_N2(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDR-A1', 'xpdr-num': '1', 'network-num': '2', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '2', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' 
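Editor's note on the recurring KeyError: the transportpce_api_rpc_request() source dumped in these frames only special-cases 204 (no content) and 500; for any other status it indexes the '<module>:output' member of the JSON body. The 401 responses in this run carry no such member, so the failure surfaces as KeyError: 'transportpce-networkutils:output' rather than as a readable status-code assertion. Below is a hedged sketch of a more defensive variant that reuses the existing test_utils names (post_request, prepend_dict_keys, RESTCONF_VERSION); it is not the project's actual helper.

# Hedged sketch only: a defensive variant of the helper shown above.
# The import assumes these names are reachable exactly as the traceback shows.
import requests
from transportpce_tests.common.test_utils import (
    RESTCONF_VERSION, post_request, prepend_dict_keys)

def transportpce_api_rpc_request_safe(api_module: str, rpc: str, payload: dict):
    url = "{}/operations/{}:{}".format('{}', api_module, rpc)
    if payload is None:
        data = None
    elif RESTCONF_VERSION == 'draft-bierman02':
        data = prepend_dict_keys({'input': payload}, api_module + ':')
    else:
        data = {'input': payload}
    response = post_request(url, data)
    if response.status_code == requests.codes.no_content:
        return {'status_code': response.status_code, 'output': None}
    res = response.json()
    return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'}
    # Fall back to the raw body for anything without an output container
    # (401 Unauthorized, RESTCONF error reports, ...) instead of raising
    # KeyError deep inside the helper.
    output = res.get(return_key[RESTCONF_VERSION], res)
    return {'status_code': response.status_code, 'output': output}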
transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_10_connect_roadmA_PP2_to_xpdrA_N2 _____ TransportPCEFulltesting.test_11_connect_xpdrC_xpdr2_N1_to_roadmC_PP2 _____ self = def test_11_connect_xpdrC_xpdr2_N1_to_roadmC_PP2(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '2', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:225: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_11_connect_xpdrC_xpdr2_N1_to_roadmC_PP2 _____ TransportPCEFulltesting.test_12_connect_roadmC_PP2_to_xpdrC_xpdr2_N1 _____ self = def test_12_connect_roadmC_PP2_to_xpdrC_xpdr2_N1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '2', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX'}}) transportpce_tests/2.2.1/test12_end2end.py:234: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP2-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of 
test_12_connect_roadmC_PP2_to_xpdrC_xpdr2_N1 _______ TransportPCEFulltesting.test_13_add_omsAttributes_ROADMA_ROADMC ________ self = def test_13_add_omsAttributes_ROADMA_ROADMC(self): # Config ROADMA-ROADMC oms-attributes data = {"span": { "auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{ "SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}} response = test_utils.add_oms_attr_request("ROADM-A1-DEG2-DEG2-TTP-TXRXtoROADM-C1-DEG1-DEG1-TTP-TXRX", data) > self.assertEqual(response.status_code, requests.codes.created) E AssertionError: 401 != 201 transportpce_tests/2.2.1/test12_end2end.py:256: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_13_add_omsAttributes_ROADMA_ROADMC _______ TransportPCEFulltesting.test_14_add_omsAttributes_ROADMC_ROADMA ________ self = def test_14_add_omsAttributes_ROADMC_ROADMA(self): # Config ROADMC-ROADMA oms-attributes data = {"span": { "auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{ "SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}} response = test_utils.add_oms_attr_request("ROADM-C1-DEG1-DEG1-TTP-TXRXtoROADM-A1-DEG2-DEG2-TTP-TXRX", data) > self.assertEqual(response.status_code, requests.codes.created) E AssertionError: 401 != 201 transportpce_tests/2.2.1/test12_end2end.py:272: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_14_add_omsAttributes_ROADMC_ROADMA _____________ TransportPCEFulltesting.test_15_create_eth_service2 ______________ self = def test_15_create_eth_service2(self): self.cr_serv_input_data["service-name"] = "service2" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:277: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'service', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_15_create_eth_service2 _______________ TransportPCEFulltesting.test_16_get_eth_service2 _______________ self = def test_16_get_eth_service2(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service2") transportpce_tests/2.2.1/test12_end2end.py:286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 
'service2' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_16_get_eth_service2 _______________ TransportPCEFulltesting.test_17_check_xc1_ROADMA _______________ self = def test_17_check_xc1_ROADMA(self): response = test_utils.check_node_attribute_request( "ROADM-A1", "roadm-connections", "SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:297: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_17_check_xc1_ROADMA Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-A1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768 _______________ TransportPCEFulltesting.test_18_check_xc1_ROADMC _______________ self = def test_18_check_xc1_ROADMC(self): response = test_utils.check_node_attribute_request( "ROADM-C1", "roadm-connections", "SRG1-PP2-TXRX-DEG1-TTP-TXRX-761:768") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:312: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_18_check_xc1_ROADMC Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-C1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP2-TXRX-DEG1-TTP-TXRX-761:768 _______________ TransportPCEFulltesting.test_19_check_topo_XPDRA _______________ self = def test_19_check_topo_XPDRA(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDR-A1-XPDR1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:326: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_19_check_topo_XPDRA ____________ TransportPCEFulltesting.test_20_check_topo_ROADMA_SRG1 ____________ self = def test_20_check_topo_ROADMA_SRG1(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-SRG1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:344: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_20_check_topo_ROADMA_SRG1 ____________ TransportPCEFulltesting.test_21_check_topo_ROADMA_DEG2 ____________ self = def test_21_check_topo_ROADMA_DEG2(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-DEG2', 'config') > 
self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:362: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_21_check_topo_ROADMA_DEG2 _____________ TransportPCEFulltesting.test_22_create_eth_service1 ______________ self = def test_22_create_eth_service1(self): self.cr_serv_input_data["service-name"] = "service1" del self.cr_serv_input_data["service-a-end"]["tx-direction"][0]["port"]["port-device-name"] del self.cr_serv_input_data["service-a-end"]["tx-direction"][0]["port"]["port-name"] del self.cr_serv_input_data["service-a-end"]["rx-direction"][0]["port"]["port-device-name"] del self.cr_serv_input_data["service-a-end"]["rx-direction"][0]["port"]["port-name"] del self.cr_serv_input_data["service-z-end"]["tx-direction"][0]["port"]["port-device-name"] del self.cr_serv_input_data["service-z-end"]["tx-direction"][0]["port"]["port-name"] del self.cr_serv_input_data["service-z-end"]["rx-direction"][0]["port"]["port-device-name"] del self.cr_serv_input_data["service-z-end"]["rx-direction"][0]["port"]["port-name"] > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:391: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'service', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_22_create_eth_service1 _______________ TransportPCEFulltesting.test_23_get_eth_service1 _______________ self = def test_23_get_eth_service1(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") transportpce_tests/2.2.1/test12_end2end.py:400: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' 
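Editor's note, same pattern for get_ordm_serv_list_attr_request(): the helper expects either the requested org-openroadm-service member or an RFC 8040 'errors' container in the JSON body, and the 401 responses here contain neither, hence KeyError: 'errors'. For reference, a small sketch of the URL and return-key selection the helper performs for the two RESTCONF dialects it supports; the templates and keys are copied from the source above, while the base URLs and the 'service1' value are illustrative assumptions.

# Sketch of the dialect handling in get_ordm_serv_list_attr_request().
# Templates and return keys come from the helper source shown above;
# the base URLs are assumptions used only to make the example concrete.
BASES = {'rfc8040': 'http://localhost:8181/rests',
         'draft-bierman02': 'http://localhost:8181/restconf'}
TEMPLATES = {
    'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig',
    'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}',
}
RETURN_KEYS = {'rfc8040': 'org-openroadm-service:services',
               'draft-bierman02': 'services'}

for dialect, template in TEMPLATES.items():
    url = template.format(BASES[dialect], 'services', 'service1')
    print(dialect, url, '-> expects key', RETURN_KEYS[dialect])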
transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_23_get_eth_service1 _______________ TransportPCEFulltesting.test_24_check_xc1_ROADMA _______________ self = def test_24_check_xc1_ROADMA(self): response = test_utils.check_node_attribute_request( "ROADM-A1", "roadm-connections", "DEG2-TTP-TXRX-SRG1-PP2-TXRX-753:760") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:411: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_24_check_xc1_ROADMA Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-A1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=DEG2-TTP-TXRX-SRG1-PP2-TXRX-753:760 _______________ TransportPCEFulltesting.test_25_check_topo_XPDRA _______________ self = def test_25_check_topo_XPDRA(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDR-A1-XPDR1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:424: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_25_check_topo_XPDRA ____________ TransportPCEFulltesting.test_26_check_topo_ROADMA_SRG1 ____________ self = def test_26_check_topo_ROADMA_SRG1(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-SRG1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:440: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_26_check_topo_ROADMA_SRG1 ____________ TransportPCEFulltesting.test_27_check_topo_ROADMA_DEG2 ____________ self = def test_27_check_topo_ROADMA_DEG2(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-DEG2', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:466: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_27_check_topo_ROADMA_DEG2 _____________ TransportPCEFulltesting.test_28_create_eth_service3 ______________ self = def test_28_create_eth_service3(self): self.cr_serv_input_data["service-name"] = "service3" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:491: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'service', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key 
= {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_28_create_eth_service3 _____________ TransportPCEFulltesting.test_29_delete_eth_service3 ______________ self = def test_29_delete_eth_service3(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service3" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:503: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-delete' payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service3', 'tail-retention': 'no'}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_29_delete_eth_service3 _____________ TransportPCEFulltesting.test_30_delete_eth_service1 ______________ self = def test_30_delete_eth_service1(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:514: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-delete' payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service1', 'tail-retention': 'no'}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > 
return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_30_delete_eth_service1 _____________ TransportPCEFulltesting.test_31_delete_eth_service2 ______________ self = def test_31_delete_eth_service2(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service2" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:524: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-delete' payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service2', 'tail-retention': 'no'}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_31_delete_eth_service2 ______________ TransportPCEFulltesting.test_32_check_no_xc_ROADMA ______________ self = def test_32_check_no_xc_ROADMA(self): > response = test_utils.check_node_request("ROADM-A1") transportpce_tests/2.2.1/test12_end2end.py:533: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'ROADM-A1' def check_node_request(node: str): # pylint: disable=line-too-long url = {'rfc8040': '{}/data/network-topology:network-topology/topology=topology-netconf/node={}/yang-ext:mount/org-openroadm-device:org-openroadm-device?content=config', # nopep8 'draft-bierman02': '{}/config/network-topology:network-topology/topology/topology-netconf/node/{}/yang-ext:mount/org-openroadm-device:org-openroadm-device'} # nopep8 response = get_request(url[RESTCONF_VERSION].format('{}', node)) res = response.json() return_key = {'rfc8040': 'org-openroadm-device:org-openroadm-device', 'draft-bierman02': 'org-openroadm-device'} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:392: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_32_check_no_xc_ROADMA _______________ TransportPCEFulltesting.test_33_check_topo_XPDRA _______________ self = def test_33_check_topo_XPDRA(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'XPDR-A1-XPDR1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E 
AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:541: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_33_check_topo_XPDRA ____________ TransportPCEFulltesting.test_34_check_topo_ROADMA_SRG1 ____________ self = def test_34_check_topo_ROADMA_SRG1(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-SRG1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:555: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_34_check_topo_ROADMA_SRG1 ____________ TransportPCEFulltesting.test_35_check_topo_ROADMA_DEG2 ____________ self = def test_35_check_topo_ROADMA_DEG2(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-DEG2', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:581: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_35_check_topo_ROADMA_DEG2 ______________ TransportPCEFulltesting.test_36_create_oc_service1 ______________ self = def test_36_create_oc_service1(self): self.cr_serv_input_data["service-name"] = "service1" self.cr_serv_input_data["connection-type"] = "roadm-line" self.cr_serv_input_data["service-a-end"]["node-id"] = "ROADM-A1" self.cr_serv_input_data["service-a-end"]["service-format"] = "OC" self.cr_serv_input_data["service-z-end"]["node-id"] = "ROADM-C1" self.cr_serv_input_data["service-z-end"]["service-format"] = "OC" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:611: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'roadm-line', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_36_create_oc_service1 _______________ TransportPCEFulltesting.test_37_get_oc_service1 ________________ self = def test_37_get_oc_service1(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") transportpce_tests/2.2.1/test12_end2end.py:620: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = 
{'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_37_get_oc_service1 _______________ TransportPCEFulltesting.test_38_check_xc1_ROADMA _______________ self = def test_38_check_xc1_ROADMA(self): response = test_utils.check_node_attribute_request( "ROADM-A1", "roadm-connections", "SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:631: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_38_check_xc1_ROADMA Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-A1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG2-TTP-TXRX-761:768 _______________ TransportPCEFulltesting.test_39_check_xc1_ROADMC _______________ self = def test_39_check_xc1_ROADMC(self): response = test_utils.check_node_attribute_request( "ROADM-C1", "roadm-connections", "SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:646: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_39_check_xc1_ROADMC Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-C1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP1-TXRX-DEG1-TTP-TXRX-761:768 ______________ TransportPCEFulltesting.test_40_create_oc_service2 ______________ self = def test_40_create_oc_service2(self): self.cr_serv_input_data["service-name"] = "service2" self.cr_serv_input_data["connection-type"] = "roadm-line" self.cr_serv_input_data["service-a-end"]["node-id"] = "ROADM-A1" self.cr_serv_input_data["service-a-end"]["service-format"] = "OC" self.cr_serv_input_data["service-z-end"]["node-id"] = "ROADM-C1" self.cr_serv_input_data["service-z-end"]["service-format"] = "OC" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:665: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'roadm-line', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code 
== requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_40_create_oc_service2 _______________ TransportPCEFulltesting.test_41_get_oc_service2 ________________ self = def test_41_get_oc_service2(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service2") transportpce_tests/2.2.1/test12_end2end.py:674: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service2' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_41_get_oc_service2 _______________ TransportPCEFulltesting.test_42_check_xc2_ROADMA _______________ self = def test_42_check_xc2_ROADMA(self): response = test_utils.check_node_attribute_request( "ROADM-A1", "roadm-connections", "SRG1-PP2-TXRX-DEG2-TTP-TXRX-753:760") > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:685: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_42_check_xc2_ROADMA Unauthorized /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADM-A1/yang-ext:mount/org-openroadm-device:org-openroadm-device/roadm-connections=SRG1-PP2-TXRX-DEG2-TTP-TXRX-753:760 ______________ TransportPCEFulltesting.test_43_check_topo_ROADMA _______________ self = def test_43_check_topo_ROADMA(self): response = test_utils.get_ietf_network_node_request('openroadm-topology', 'ROADM-A1-SRG1', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/2.2.1/test12_end2end.py:699: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_43_check_topo_ROADMA ______________ TransportPCEFulltesting.test_44_delete_oc_service1 ______________ self = def test_44_delete_oc_service1(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:727: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-delete' payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 
'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service1', 'tail-retention': 'no'}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_44_delete_oc_service1 ______________ TransportPCEFulltesting.test_45_delete_oc_service2 ______________ self = def test_45_delete_oc_service2(self): self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service2" > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-delete', self.del_serv_input_data) transportpce_tests/2.2.1/test12_end2end.py:737: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-delete' payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service2', 'tail-retention': 'no'}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_45_delete_oc_service2 ______________ TransportPCEFulltesting.test_46_get_no_oc_services ______________ self = def test_46_get_no_oc_services(self): > response = test_utils.get_ordm_serv_list_request() transportpce_tests/2.2.1/test12_end2end.py:746: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def get_ordm_serv_list_request(): url = {'rfc8040': '{}/data/org-openroadm-service:service-list?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/'} response = get_request(url[RESTCONF_VERSION]) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:service-list', 'draft-bierman02': 'service-list'} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute 
= res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:620: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_46_get_no_oc_services _______________ TransportPCEFulltesting.test_47_get_no_xc_ROADMA _______________ self = def test_47_get_no_xc_ROADMA(self): > response = test_utils.check_node_request("ROADM-A1") transportpce_tests/2.2.1/test12_end2end.py:765: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'ROADM-A1' def check_node_request(node: str): # pylint: disable=line-too-long url = {'rfc8040': '{}/data/network-topology:network-topology/topology=topology-netconf/node={}/yang-ext:mount/org-openroadm-device:org-openroadm-device?content=config', # nopep8 'draft-bierman02': '{}/config/network-topology:network-topology/topology/topology-netconf/node/{}/yang-ext:mount/org-openroadm-device:org-openroadm-device'} # nopep8 response = get_request(url[RESTCONF_VERSION].format('{}', node)) res = response.json() return_key = {'rfc8040': 'org-openroadm-device:org-openroadm-device', 'draft-bierman02': 'org-openroadm-device'} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:392: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_47_get_no_xc_ROADMA ______________ TransportPCEFulltesting.test_48_check_topo_ROADMA _______________ self = def test_48_check_topo_ROADMA(self): > self.test_34_check_topo_ROADMA_SRG1() transportpce_tests/2.2.1/test12_end2end.py:771: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/2.2.1/test12_end2end.py:555: in test_34_check_topo_ROADMA_SRG1 self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_48_check_topo_ROADMA ____________ TransportPCEFulltesting.test_49_loop_create_oc_service ____________ self = def test_49_loop_create_oc_service(self): for i in range(1, 3): # pylint: disable=consider-using-f-string print("iteration number {}".format(i)) print("oc service creation") > self.test_36_create_oc_service1() transportpce_tests/2.2.1/test12_end2end.py:779: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/2.2.1/test12_end2end.py:611: in test_36_create_oc_service1 response = test_utils.transportpce_api_rpc_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'roadm-line', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == 
requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_49_loop_create_oc_service iteration number 1 oc service creation ___________ TransportPCEFulltesting.test_50_loop_create_eth_service ____________ self = def test_50_loop_create_eth_service(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") transportpce_tests/2.2.1/test12_end2end.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_50_loop_create_eth_service _______________ TransportPCEFulltesting.test_51_disconnect_XPDRA _______________ self = def test_51_disconnect_XPDRA(self): response = test_utils.unmount_device("XPDR-A1") > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) E AssertionError: 401 not found in (200, 204) transportpce_tests/2.2.1/test12_end2end.py:812: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_51_disconnect_XPDRA Searching for pattern 'onDeviceDisConnected:\ XPDR\-A1' in karaf.log... Pattern not found after 180 seconds! Node XPDR-A1 still not deleted from tpce topology... _______________ TransportPCEFulltesting.test_52_disconnect_XPDRC _______________ self = def test_52_disconnect_XPDRC(self): response = test_utils.unmount_device("XPDR-C1") > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) E AssertionError: 401 not found in (200, 204) transportpce_tests/2.2.1/test12_end2end.py:816: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_52_disconnect_XPDRC Searching for pattern 'onDeviceDisConnected:\ XPDR\-C1' in karaf.log... Pattern not found after 180 seconds! Node XPDR-C1 still not deleted from tpce topology... ______________ TransportPCEFulltesting.test_53_disconnect_ROADMA _______________ self = def test_53_disconnect_ROADMA(self): response = test_utils.unmount_device("ROADM-A1") > self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content)) E AssertionError: 401 not found in (200, 204) transportpce_tests/2.2.1/test12_end2end.py:820: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_53_disconnect_ROADMA Searching for pattern 'onDeviceDisConnected:\ ROADM\-A1' in karaf.log... Pattern not found after 180 seconds! Node ROADM-A1 still not deleted from tpce topology... 
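Every request in the 2.2.1 end-to-end run above is rejected with HTTP 401 Unauthorized, and the helpers in transportpce_tests/common/test_utils.py then raise KeyError because they only branch on success, 500, or an RFC 8040 'errors' body. Below is a minimal sketch of a more defensive response parser, assuming the rfc8040 interface; the name parse_rpc_response and the module-level RESTCONF_VERSION default are illustrative assumptions, not the project's actual helper.

import requests

RESTCONF_VERSION = 'rfc8040'  # assumption: the failing runs above use the RFC 8040 interface

def parse_rpc_response(response, api_module: str):
    """Return the RPC output container when present, otherwise the raw response body."""
    # 204 No Content: the RPC succeeded but returned no output container.
    if response.status_code == requests.codes.no_content:
        return None
    res = response.json()
    return_key = {'rfc8040': api_module + ':output',
                  'draft-bierman02': 'output'}[RESTCONF_VERSION]
    # A 401/403/500 body may carry an 'errors' container, or nothing useful at all,
    # so fall back to the raw body instead of indexing a key that may be missing.
    return res.get(return_key, res)

With such a fallback, an unauthorized RPC would surface the server's error body instead of the KeyError: 'org-openroadm-service:output' tracebacks seen above.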
______________ TransportPCEFulltesting.test_54_disconnect_ROADMC _______________
self =
    def test_54_disconnect_ROADMC(self):
        response = test_utils.unmount_device("ROADM-C1")
>       self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content))
E       AssertionError: 401 not found in (200, 204)
transportpce_tests/2.2.1/test12_end2end.py:824: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_54_disconnect_ROADMC
Searching for pattern 'onDeviceDisConnected:\ ROADM\-C1' in karaf.log... Pattern not found after 180 seconds!
Node ROADM-C1 still not deleted from tpce topology...
--------------------------- Captured stdout teardown ---------------------------
all processes killed
=========================== short test summary info ============================
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_01_connect_xpdrA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_02_connect_xpdrC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_03_connect_rdmA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_04_connect_rdmC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_05_connect_xpdrA_N1_to_roadmA_PP1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_06_connect_roadmA_PP1_to_xpdrA_N1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_07_connect_xpdrC_xpdr1_N1_to_roadmC_PP1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_08_connect_roadmC_PP1_to_xpdrC_xpdr1_N1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_09_connect_xpdrA_N2_to_roadmA_PP2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_10_connect_roadmA_PP2_to_xpdrA_N2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_11_connect_xpdrC_xpdr2_N1_to_roadmC_PP2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_12_connect_roadmC_PP2_to_xpdrC_xpdr2_N1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_13_add_omsAttributes_ROADMA_ROADMC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_14_add_omsAttributes_ROADMC_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_15_create_eth_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_16_get_eth_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_17_check_xc1_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_18_check_xc1_ROADMC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_19_check_topo_XPDRA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_20_check_topo_ROADMA_SRG1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_21_check_topo_ROADMA_DEG2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_22_create_eth_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_23_get_eth_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_24_check_xc1_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_25_check_topo_XPDRA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_26_check_topo_ROADMA_SRG1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_27_check_topo_ROADMA_DEG2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_28_create_eth_service3
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_29_delete_eth_service3
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_30_delete_eth_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_31_delete_eth_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_32_check_no_xc_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_33_check_topo_XPDRA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_34_check_topo_ROADMA_SRG1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_35_check_topo_ROADMA_DEG2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_36_create_oc_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_37_get_oc_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_38_check_xc1_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_39_check_xc1_ROADMC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_40_create_oc_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_41_get_oc_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_42_check_xc2_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_43_check_topo_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_44_delete_oc_service1
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_45_delete_oc_service2
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_46_get_no_oc_services
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_47_get_no_xc_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_48_check_topo_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_49_loop_create_oc_service
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_50_loop_create_eth_service
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_51_disconnect_XPDRA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_52_disconnect_XPDRC
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_53_disconnect_ROADMA
FAILED transportpce_tests/2.2.1/test12_end2end.py::TransportPCEFulltesting::test_54_disconnect_ROADMC
54 failed in 1474.05s (0:24:34)
tests121: OK ✔ in 35 minutes 42.25 seconds
tests221: exit 1 (2976.42 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 pid=43656
tests221: FAIL ✖ in 49 minutes 42.62 seconds
tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
tests_hybrid: freeze> python -m pip freeze --all
tests_hybrid:
bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==72.1.0,urllib3==2.2.2,wheel==0.44.0
tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid
using environment variables from ./karaf121.env
pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py
FFFFFFFFFFFF.FFF.FFF.FF.FFF.FF.FFF.FF.FFF.FFFFFFFFF [100%]
=================================== FAILURES ===================================
________________ TransportPCEFulltesting.test_01_connect_xpdrA _________________
self =
    def test_01_connect_xpdrA(self):
        response = test_utils.mount_device("XPDRA01", ('xpdra', self.NODE_VERSION_121))
>       self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201)
E       AssertionError: 401 != 201 : Http status code should be 201
transportpce_tests/hybrid/test01_device_change_notifications.py:97: AssertionError
---------------------------- Captured stdout setup -----------------------------
starting OpenDaylight...
starting KARAF TransportPCE build...
Searching for pattern 'Transportpce controller started' in karaf.log... Pattern found!
OpenDaylight started !
starting simulator xpdra in OpenROADM device version 1.2.1...
Searching for pattern 'Data tree change listeners registered' in xpdra-121.log... Pattern found!
simulator for xpdra started
starting simulator roadma in OpenROADM device version 2.2.1...
Searching for pattern 'Data tree change listeners registered' in roadma-221.log... Pattern found!
simulator for roadma started
starting simulator roadmc in OpenROADM device version 2.2.1...
Searching for pattern 'Data tree change listeners registered' in roadmc-221.log... Pattern found!
simulator for roadmc started
starting simulator xpdrc in OpenROADM device version 7.1...
Searching for pattern 'Data tree change listeners registered' in xpdrc-71.log... Pattern found!
simulator for xpdrc started
----------------------------- Captured stdout call -----------------------------
execution of test_01_connect_xpdrA
Searching for pattern 'Triggering notification stream NETCONF for node XPDRA01' in karaf.log... Pattern not found after 180 seconds!
Node XPDRA01 still not added to tpce topology...
________________ TransportPCEFulltesting.test_02_connect_xpdrC _________________
self =
    def test_02_connect_xpdrC(self):
        response = test_utils.mount_device("XPDR-C1", ('xpdrc', self.NODE_VERSION_71))
>       self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201)
E       AssertionError: 401 != 201 : Http status code should be 201
transportpce_tests/hybrid/test01_device_change_notifications.py:102: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_02_connect_xpdrC
Searching for pattern 'Triggering notification stream NETCONF for node XPDR-C1' in karaf.log... Pattern not found after 180 seconds!
Node XPDR-C1 still not added to tpce topology...
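The captured stdout in the hybrid run repeatedly reports "Searching for pattern ... in karaf.log... Pattern not found after 180 seconds!". A rough sketch of that kind of timed log polling is given below; the function name wait_for_log_pattern, the 5-second poll interval and the 180-second default are assumptions for illustration, since the project's real helper is not shown in this log.

import re
import time

def wait_for_log_pattern(log_path: str, pattern: str, timeout: int = 180) -> bool:
    """Poll a log file until a regex matches or the timeout expires."""
    deadline = time.monotonic() + timeout
    regex = re.compile(pattern)
    while time.monotonic() < deadline:
        try:
            with open(log_path, encoding='utf-8', errors='replace') as handle:
                if regex.search(handle.read()):
                    return True
        except FileNotFoundError:
            # karaf.log may not exist yet right after the controller is launched
            pass
        time.sleep(5)
    return False

For example, wait_for_log_pattern('karaf.log', r'Triggering notification stream NETCONF for node XPDRA01') would return False once the timeout expires, which is consistent with the "still not added to tpce topology" messages above: the devices never mount because every RESTCONF request is answered with 401 Unauthorized.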
_________________ TransportPCEFulltesting.test_03_connect_rdmA _________________ self = def test_03_connect_rdmA(self): response = test_utils.mount_device("ROADM-A1", ('roadma', self.NODE_VERSION_221)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/hybrid/test01_device_change_notifications.py:107: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_03_connect_rdmA Searching for pattern 'Triggering notification stream NETCONF for node ROADM-A1' in karaf.log... Pattern not found after 180 seconds! Node ROADM-A1 still not added to tpce topology... _________________ TransportPCEFulltesting.test_04_connect_rdmC _________________ self = def test_04_connect_rdmC(self): response = test_utils.mount_device("ROADM-C1", ('roadmc', self.NODE_VERSION_221)) > self.assertEqual(response.status_code, requests.codes.created, test_utils.CODE_SHOULD_BE_201) E AssertionError: 401 != 201 : Http status code should be 201 transportpce_tests/hybrid/test01_device_change_notifications.py:112: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_04_connect_rdmC Searching for pattern 'Triggering notification stream NETCONF for node ROADM-C1' in karaf.log... Pattern not found after 180 seconds! Node ROADM-C1 still not added to tpce topology... ________ TransportPCEFulltesting.test_05_connect_xpdrA_N1_to_roadmA_PP1 ________ self = def test_05_connect_xpdrA_N1_to_roadmA_PP1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/hybrid/test01_device_change_notifications.py:116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_05_connect_xpdrA_N1_to_roadmA_PP1 ________ TransportPCEFulltesting.test_06_connect_roadmA_PP1_to_xpdrA_N1 ________ self = def test_06_connect_roadmA_PP1_to_xpdrA_N1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDRA01', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 
'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/hybrid/test01_device_change_notifications.py:124: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-A1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_06_connect_roadmA_PP1_to_xpdrA_N1 ________ TransportPCEFulltesting.test_07_connect_xpdrC_N1_to_roadmC_PP1 ________ self = def test_07_connect_xpdrC_N1_to_roadmC_PP1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-xpdr-rdm-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/hybrid/test01_device_change_notifications.py:132: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-xpdr-rdm-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_07_connect_xpdrC_N1_to_roadmC_PP1 ________ TransportPCEFulltesting.test_08_connect_roadmC_PP1_to_xpdrC_N1 ________ self = def test_08_connect_roadmC_PP1_to_xpdrC_N1(self): > response = test_utils.transportpce_api_rpc_request( 'transportpce-networkutils', 'init-rdm-xpdr-links', {'links-input': {'xpdr-node': 'XPDR-C1', 'xpdr-num': '1', 'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX'}}) transportpce_tests/hybrid/test01_device_change_notifications.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'transportpce-networkutils', rpc = 'init-rdm-xpdr-links' payload = {'links-input': {'network-num': '1', 'rdm-node': 'ROADM-C1', 'srg-num': '1', 'termination-point-num': 'SRG1-PP1-TXRX', ...}} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = "{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'transportpce-networkutils:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_08_connect_roadmC_PP1_to_xpdrC_N1 _______ TransportPCEFulltesting.test_09_add_omsAttributes_ROADMA_ROADMC ________ self = def test_09_add_omsAttributes_ROADMA_ROADMC(self): # Config ROADMA-ROADMC oms-attributes data = {"span": { "auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{ "SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}} response = test_utils.add_oms_attr_request( "ROADM-A1-DEG2-DEG2-TTP-TXRXtoROADM-C1-DEG1-DEG1-TTP-TXRX", data) > self.assertEqual(response.status_code, requests.codes.created) E AssertionError: 401 != 201 transportpce_tests/hybrid/test01_device_change_notifications.py:161: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_09_add_omsAttributes_ROADMA_ROADMC _______ TransportPCEFulltesting.test_10_add_omsAttributes_ROADMC_ROADMA ________ self = def test_10_add_omsAttributes_ROADMC_ROADMA(self): # Config ROADMC-ROADMA oms-attributes data = {"span": { "auto-spanloss": "true", "spanloss-base": 11.4, "spanloss-current": 12, "engineered-spanloss": 12.2, "link-concatenation": [{ "SRLG-Id": 0, "fiber-type": "smf", "SRLG-length": 100000, "pmd": 0.5}]}} response = test_utils.add_oms_attr_request( "ROADM-C1-DEG1-DEG1-TTP-TXRXtoROADM-A1-DEG2-DEG2-TTP-TXRX", data) > self.assertEqual(response.status_code, requests.codes.created) E AssertionError: 401 != 201 transportpce_tests/hybrid/test01_device_change_notifications.py:177: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_10_add_omsAttributes_ROADMC_ROADMA _____________ TransportPCEFulltesting.test_11_create_eth_service1 ______________ self = def test_11_create_eth_service1(self): > response = test_utils.transportpce_api_rpc_request( 'org-openroadm-service', 'service-create', self.cr_serv_input_data) transportpce_tests/hybrid/test01_device_change_notifications.py:181: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ api_module = 'org-openroadm-service', rpc = 'service-create' payload = {'common-id': 'ASATT1234567', 'connection-type': 'service', 'due-date': '2016-11-28T00:00:01Z', 'operator-contact': 'pw1234', ...} def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict): # pylint: disable=consider-using-f-string url = 
"{}/operations/{}:{}".format('{}', api_module, rpc) if payload is None: data = None elif RESTCONF_VERSION == 'draft-bierman02': data = prepend_dict_keys({'input': payload}, api_module + ':') else: data = {'input': payload} response = post_request(url, data) if response.status_code == requests.codes.no_content: return_output = None else: res = response.json() return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'} if response.status_code == requests.codes.internal_server_error: return_output = res else: > return_output = res[return_key[RESTCONF_VERSION]] E KeyError: 'org-openroadm-service:output' transportpce_tests/common/test_utils.py:694: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_11_create_eth_service1 _______________ TransportPCEFulltesting.test_12_get_eth_service1 _______________ self = def test_12_get_eth_service1(self): > response = test_utils.get_ordm_serv_list_attr_request( "services", "service1") transportpce_tests/hybrid/test01_device_change_notifications.py:190: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_12_get_eth_service1 ___________ TransportPCEFulltesting.test_14_check_update_portmapping ___________ self = def test_14_check_update_portmapping(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", None, None) transportpce_tests/hybrid/test01_device_change_notifications.py:212: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'XPDRA01', attr = 'nodes', value = None def get_portmapping_node_attr(node: str, attr: str, value: str): # pylint: disable=consider-using-f-string url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}', 'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'} target_url = url[RESTCONF_VERSION].format('{}', node) if attr is not None: target_url = (target_url + '/{}').format('{}', attr) if value is not None: suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'} target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value) else: attr = 'nodes' response = get_request(target_url) res = response.json() return_key = {'rfc8040': 'transportpce-portmapping:' + attr, 'draft-bierman02': attr} if return_key[RESTCONF_VERSION] in res.keys(): return_output = res[return_key[RESTCONF_VERSION]] else: > return_output = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:477: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_14_check_update_portmapping _________ TransportPCEFulltesting.test_15_check_update_openroadm_topo __________ self = def 
test_15_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/hybrid/test01_device_change_notifications.py:230: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_15_check_update_openroadm_topo ____________ TransportPCEFulltesting.test_16_check_update_service1 _____________ self = def test_16_check_update_service1(self): > response = test_utils.get_ordm_serv_list_attr_request("services", "service1") transportpce_tests/hybrid/test01_device_change_notifications.py:263: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_16_check_update_service1 _________ TransportPCEFulltesting.test_18_check_update_portmapping_ok __________ self = def test_18_check_update_portmapping_ok(self): > response = test_utils.get_portmapping_node_attr("XPDRA01", None, None) transportpce_tests/hybrid/test01_device_change_notifications.py:282: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'XPDRA01', attr = 'nodes', value = None def get_portmapping_node_attr(node: str, attr: str, value: str): # pylint: disable=consider-using-f-string url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}', 'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'} target_url = url[RESTCONF_VERSION].format('{}', node) if attr is not None: target_url = (target_url + '/{}').format('{}', attr) if value is not None: suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'} target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value) else: attr = 'nodes' response = get_request(target_url) res = response.json() return_key = {'rfc8040': 'transportpce-portmapping:' + attr, 'draft-bierman02': attr} if return_key[RESTCONF_VERSION] in res.keys(): return_output = res[return_key[RESTCONF_VERSION]] else: > return_output = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:477: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_18_check_update_portmapping_ok ________ TransportPCEFulltesting.test_19_check_update_openroadm_topo_ok ________ self = def test_19_check_update_openroadm_topo_ok(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/hybrid/test01_device_change_notifications.py:294: AssertionError ----------------------------- Captured stdout call 
----------------------------- execution of test_19_check_update_openroadm_topo_ok ___________ TransportPCEFulltesting.test_20_check_update_service1_ok ___________ self = def test_20_check_update_service1_ok(self): > self.test_12_get_eth_service1() transportpce_tests/hybrid/test01_device_change_notifications.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/hybrid/test01_device_change_notifications.py:190: in test_12_get_eth_service1 response = test_utils.get_ordm_serv_list_attr_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_20_check_update_service1_ok ___________ TransportPCEFulltesting.test_22_check_update_portmapping ___________ self = def test_22_check_update_portmapping(self): > response = test_utils.get_portmapping_node_attr("ROADM-A1", None, None) transportpce_tests/hybrid/test01_device_change_notifications.py:326: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'ROADM-A1', attr = 'nodes', value = None def get_portmapping_node_attr(node: str, attr: str, value: str): # pylint: disable=consider-using-f-string url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}', 'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'} target_url = url[RESTCONF_VERSION].format('{}', node) if attr is not None: target_url = (target_url + '/{}').format('{}', attr) if value is not None: suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'} target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value) else: attr = 'nodes' response = get_request(target_url) res = response.json() return_key = {'rfc8040': 'transportpce-portmapping:' + attr, 'draft-bierman02': attr} if return_key[RESTCONF_VERSION] in res.keys(): return_output = res[return_key[RESTCONF_VERSION]] else: > return_output = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:477: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_22_check_update_portmapping _________ TransportPCEFulltesting.test_23_check_update_openroadm_topo __________ self = def test_23_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 transportpce_tests/hybrid/test01_device_change_notifications.py:344: AssertionError ----------------------------- Captured stdout call ----------------------------- execution of test_23_check_update_openroadm_topo _________ TransportPCEFulltesting.test_25_check_update_portmapping_ok __________ self = def 
test_25_check_update_portmapping_ok(self): > self.test_18_check_update_portmapping_ok() transportpce_tests/hybrid/test01_device_change_notifications.py:389: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/hybrid/test01_device_change_notifications.py:282: in test_18_check_update_portmapping_ok response = test_utils.get_portmapping_node_attr("XPDRA01", None, None) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ node = 'XPDRA01', attr = 'nodes', value = None def get_portmapping_node_attr(node: str, attr: str, value: str): # pylint: disable=consider-using-f-string url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}', 'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'} target_url = url[RESTCONF_VERSION].format('{}', node) if attr is not None: target_url = (target_url + '/{}').format('{}', attr) if value is not None: suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'} target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value) else: attr = 'nodes' response = get_request(target_url) res = response.json() return_key = {'rfc8040': 'transportpce-portmapping:' + attr, 'draft-bierman02': attr} if return_key[RESTCONF_VERSION] in res.keys(): return_output = res[return_key[RESTCONF_VERSION]] else: > return_output = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:477: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_25_check_update_portmapping_ok ________ TransportPCEFulltesting.test_26_check_update_openroadm_topo_ok ________ self = def test_26_check_update_openroadm_topo_ok(self): > self.test_19_check_update_openroadm_topo_ok() transportpce_tests/hybrid/test01_device_change_notifications.py:392: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/hybrid/test01_device_change_notifications.py:294: in test_19_check_update_openroadm_topo_ok self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 401 != 200 ----------------------------- Captured stdout call ----------------------------- execution of test_26_check_update_openroadm_topo_ok ___________ TransportPCEFulltesting.test_27_check_update_service1_ok ___________ self = def test_27_check_update_service1_ok(self): > self.test_12_get_eth_service1() transportpce_tests/hybrid/test01_device_change_notifications.py:395: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/hybrid/test01_device_change_notifications.py:190: in test_12_get_eth_service1 response = test_utils.get_ordm_serv_list_attr_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ attribute = 'services', value = 'service1' def get_ordm_serv_list_attr_request(attribute: str, value: str): url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig', 'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'} format_args = ('{}', attribute, value) response = get_request(url[RESTCONF_VERSION].format(*format_args)) res = response.json() return_key = {'rfc8040': 'org-openroadm-service:' + attribute, 'draft-bierman02': attribute} if return_key[RESTCONF_VERSION] in res.keys(): response_attribute = res[return_key[RESTCONF_VERSION]] else: > response_attribute = res['errors']['error'][0] E KeyError: 'errors' transportpce_tests/common/test_utils.py:636: KeyError 
___________ TransportPCEFulltesting.test_29_check_update_portmapping ___________

self =

    def test_29_check_update_portmapping(self):
>       response = test_utils.get_portmapping_node_attr("ROADM-A1", None, None)

transportpce_tests/hybrid/test01_device_change_notifications.py:410:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

node = 'ROADM-A1', attr = 'nodes', value = None

    def get_portmapping_node_attr(node: str, attr: str, value: str):
        # pylint: disable=consider-using-f-string
        url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}',
               'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'}
        target_url = url[RESTCONF_VERSION].format('{}', node)
        if attr is not None:
            target_url = (target_url + '/{}').format('{}', attr)
            if value is not None:
                suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'}
                target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value)
        else:
            attr = 'nodes'
        response = get_request(target_url)
        res = response.json()
        return_key = {'rfc8040': 'transportpce-portmapping:' + attr,
                      'draft-bierman02': attr}
        if return_key[RESTCONF_VERSION] in res.keys():
            return_output = res[return_key[RESTCONF_VERSION]]
        else:
>           return_output = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:477: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_29_check_update_portmapping
_________ TransportPCEFulltesting.test_30_check_update_openroadm_topo __________

self =

    def test_30_check_update_openroadm_topo(self):
        response = test_utils.get_ietf_network_request('openroadm-topology', 'config')
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 401 != 200

transportpce_tests/hybrid/test01_device_change_notifications.py:428: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_30_check_update_openroadm_topo
_________ TransportPCEFulltesting.test_32_check_update_portmapping_ok __________

self =

    def test_32_check_update_portmapping_ok(self):
>       self.test_18_check_update_portmapping_ok()

transportpce_tests/hybrid/test01_device_change_notifications.py:473:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:282: in test_18_check_update_portmapping_ok
    response = test_utils.get_portmapping_node_attr("XPDRA01", None, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

node = 'XPDRA01', attr = 'nodes', value = None

    def get_portmapping_node_attr(node: str, attr: str, value: str):
        # pylint: disable=consider-using-f-string
        url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}',
               'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'}
        target_url = url[RESTCONF_VERSION].format('{}', node)
        if attr is not None:
            target_url = (target_url + '/{}').format('{}', attr)
            if value is not None:
                suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'}
                target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value)
        else:
            attr = 'nodes'
        response = get_request(target_url)
        res = response.json()
        return_key = {'rfc8040': 'transportpce-portmapping:' + attr,
                      'draft-bierman02': attr}
        if return_key[RESTCONF_VERSION] in res.keys():
            return_output = res[return_key[RESTCONF_VERSION]]
        else:
>           return_output = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:477: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_32_check_update_portmapping_ok
________ TransportPCEFulltesting.test_33_check_update_openroadm_topo_ok ________

self =

    def test_33_check_update_openroadm_topo_ok(self):
>       self.test_19_check_update_openroadm_topo_ok()

transportpce_tests/hybrid/test01_device_change_notifications.py:476:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:294: in test_19_check_update_openroadm_topo_ok
    self.assertEqual(response['status_code'], requests.codes.ok)
E   AssertionError: 401 != 200
----------------------------- Captured stdout call -----------------------------
execution of test_33_check_update_openroadm_topo_ok
___________ TransportPCEFulltesting.test_34_check_update_service1_ok ___________

self =

    def test_34_check_update_service1_ok(self):
>       self.test_12_get_eth_service1()

transportpce_tests/hybrid/test01_device_change_notifications.py:479:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:190: in test_12_get_eth_service1
    response = test_utils.get_ordm_serv_list_attr_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

attribute = 'services', value = 'service1'

    def get_ordm_serv_list_attr_request(attribute: str, value: str):
        url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig',
               'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'}
        format_args = ('{}', attribute, value)
        response = get_request(url[RESTCONF_VERSION].format(*format_args))
        res = response.json()
        return_key = {'rfc8040': 'org-openroadm-service:' + attribute,
                      'draft-bierman02': attribute}
        if return_key[RESTCONF_VERSION] in res.keys():
            response_attribute = res[return_key[RESTCONF_VERSION]]
        else:
>           response_attribute = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:636: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_34_check_update_service1_ok
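The 401 != 200 assertions, and the 401 replies behind the KeyError failures, mean the RESTCONF requests are being rejected by the controller before any datastore content is returned. As a purely illustrative sketch of the kind of authenticated GET these helpers wrap; the real test_utils.get_request() implementation, the endpoint and the credentials are not shown in this log and are placeholders here:

    import requests

    RESTCONF_BASE = 'http://localhost:8181/rests'  # placeholder endpoint, not a project constant
    CREDENTIALS = ('admin', 'admin')               # placeholder credentials

    def get_request_sketch(path: str) -> requests.Response:
        # When the credentials are wrong or the auth realm is misconfigured, the
        # reply is 401 with no usable RESTCONF payload, which is exactly the
        # case the helpers above fall over on.
        return requests.get(RESTCONF_BASE + path,
                            auth=CREDENTIALS,
                            headers={'Accept': 'application/json'},
                            timeout=10)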
___________ TransportPCEFulltesting.test_36_check_update_portmapping ___________

self =

    def test_36_check_update_portmapping(self):
>       response = test_utils.get_portmapping_node_attr("XPDR-C1", None, None)

transportpce_tests/hybrid/test01_device_change_notifications.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

node = 'XPDR-C1', attr = 'nodes', value = None

    def get_portmapping_node_attr(node: str, attr: str, value: str):
        # pylint: disable=consider-using-f-string
        url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}',
               'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'}
        target_url = url[RESTCONF_VERSION].format('{}', node)
        if attr is not None:
            target_url = (target_url + '/{}').format('{}', attr)
            if value is not None:
                suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'}
                target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value)
        else:
            attr = 'nodes'
        response = get_request(target_url)
        res = response.json()
        return_key = {'rfc8040': 'transportpce-portmapping:' + attr,
                      'draft-bierman02': attr}
        if return_key[RESTCONF_VERSION] in res.keys():
            return_output = res[return_key[RESTCONF_VERSION]]
        else:
>           return_output = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:477: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_36_check_update_portmapping
_________ TransportPCEFulltesting.test_37_check_update_openroadm_topo __________

self =

    def test_37_check_update_openroadm_topo(self):
        response = test_utils.get_ietf_network_request('openroadm-topology', 'config')
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 401 != 200

transportpce_tests/hybrid/test01_device_change_notifications.py:510: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_37_check_update_openroadm_topo
_________ TransportPCEFulltesting.test_39_check_update_portmapping_ok __________

self =

    def test_39_check_update_portmapping_ok(self):
>       self.test_18_check_update_portmapping_ok()

transportpce_tests/hybrid/test01_device_change_notifications.py:553:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:282: in test_18_check_update_portmapping_ok
    response = test_utils.get_portmapping_node_attr("XPDRA01", None, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

node = 'XPDRA01', attr = 'nodes', value = None

    def get_portmapping_node_attr(node: str, attr: str, value: str):
        # pylint: disable=consider-using-f-string
        url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}',
               'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'}
        target_url = url[RESTCONF_VERSION].format('{}', node)
        if attr is not None:
            target_url = (target_url + '/{}').format('{}', attr)
            if value is not None:
                suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'}
                target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value)
        else:
            attr = 'nodes'
        response = get_request(target_url)
        res = response.json()
        return_key = {'rfc8040': 'transportpce-portmapping:' + attr,
                      'draft-bierman02': attr}
        if return_key[RESTCONF_VERSION] in res.keys():
            return_output = res[return_key[RESTCONF_VERSION]]
        else:
>           return_output = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:477: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_39_check_update_portmapping_ok
________ TransportPCEFulltesting.test_40_check_update_openroadm_topo_ok ________

self =

    def test_40_check_update_openroadm_topo_ok(self):
>       self.test_19_check_update_openroadm_topo_ok()

transportpce_tests/hybrid/test01_device_change_notifications.py:556:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:294: in test_19_check_update_openroadm_topo_ok
    self.assertEqual(response['status_code'], requests.codes.ok)
E   AssertionError: 401 != 200
----------------------------- Captured stdout call -----------------------------
execution of test_40_check_update_openroadm_topo_ok
___________ TransportPCEFulltesting.test_41_check_update_service1_ok ___________

self =

    def test_41_check_update_service1_ok(self):
>       self.test_12_get_eth_service1()

transportpce_tests/hybrid/test01_device_change_notifications.py:559:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:190: in test_12_get_eth_service1
    response = test_utils.get_ordm_serv_list_attr_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

attribute = 'services', value = 'service1'

    def get_ordm_serv_list_attr_request(attribute: str, value: str):
        url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig',
               'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'}
        format_args = ('{}', attribute, value)
        response = get_request(url[RESTCONF_VERSION].format(*format_args))
        res = response.json()
        return_key = {'rfc8040': 'org-openroadm-service:' + attribute,
                      'draft-bierman02': attribute}
        if return_key[RESTCONF_VERSION] in res.keys():
            response_attribute = res[return_key[RESTCONF_VERSION]]
        else:
>           response_attribute = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:636: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_41_check_update_service1_ok
___________ TransportPCEFulltesting.test_43_check_update_portmapping ___________

self =

    def test_43_check_update_portmapping(self):
>       response = test_utils.get_portmapping_node_attr("ROADM-A1", None, None)

transportpce_tests/hybrid/test01_device_change_notifications.py:574:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

node = 'ROADM-A1', attr = 'nodes', value = None

    def get_portmapping_node_attr(node: str, attr: str, value: str):
        # pylint: disable=consider-using-f-string
        url = {'rfc8040': '{}/data/transportpce-portmapping:network/nodes={}',
               'draft-bierman02': '{}/config/transportpce-portmapping:network/nodes/{}'}
        target_url = url[RESTCONF_VERSION].format('{}', node)
        if attr is not None:
            target_url = (target_url + '/{}').format('{}', attr)
            if value is not None:
                suffix = {'rfc8040': '={}', 'draft-bierman02': '/{}'}
                target_url = (target_url + suffix[RESTCONF_VERSION]).format('{}', value)
        else:
            attr = 'nodes'
        response = get_request(target_url)
        res = response.json()
        return_key = {'rfc8040': 'transportpce-portmapping:' + attr,
                      'draft-bierman02': attr}
        if return_key[RESTCONF_VERSION] in res.keys():
            return_output = res[return_key[RESTCONF_VERSION]]
        else:
>           return_output = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:477: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_43_check_update_portmapping
_________ TransportPCEFulltesting.test_44_check_update_openroadm_topo __________

self =

    def test_44_check_update_openroadm_topo(self):
        response = test_utils.get_ietf_network_request('openroadm-topology', 'config')
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 401 != 200

transportpce_tests/hybrid/test01_device_change_notifications.py:592: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_44_check_update_openroadm_topo
___________ TransportPCEFulltesting.test_45_check_update_service1_ok ___________

self =

    def test_45_check_update_service1_ok(self):
>       self.test_12_get_eth_service1()

transportpce_tests/hybrid/test01_device_change_notifications.py:618:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
transportpce_tests/hybrid/test01_device_change_notifications.py:190: in test_12_get_eth_service1
    response = test_utils.get_ordm_serv_list_attr_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

attribute = 'services', value = 'service1'

    def get_ordm_serv_list_attr_request(attribute: str, value: str):
        url = {'rfc8040': '{}/data/org-openroadm-service:service-list/{}={}?content=nonconfig',
               'draft-bierman02': '{}/operational/org-openroadm-service:service-list/{}/{}'}
        format_args = ('{}', attribute, value)
        response = get_request(url[RESTCONF_VERSION].format(*format_args))
        res = response.json()
        return_key = {'rfc8040': 'org-openroadm-service:' + attribute,
                      'draft-bierman02': attribute}
        if return_key[RESTCONF_VERSION] in res.keys():
            response_attribute = res[return_key[RESTCONF_VERSION]]
        else:
>           response_attribute = res['errors']['error'][0]
E           KeyError: 'errors'

transportpce_tests/common/test_utils.py:636: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_45_check_update_service1_ok
_____________ TransportPCEFulltesting.test_46_delete_eth_service1 ______________

self =

    def test_46_delete_eth_service1(self):
        self.del_serv_input_data["service-delete-req-info"]["service-name"] = "service1"
>       response = test_utils.transportpce_api_rpc_request(
            'org-openroadm-service', 'service-delete', self.del_serv_input_data)

transportpce_tests/hybrid/test01_device_change_notifications.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

api_module = 'org-openroadm-service', rpc = 'service-delete'
payload = {'sdnc-request-header': {'notification-url': 'http://localhost:8585/NotificationServer/notify', 'request-id': 'e3028ba...ame', 'rpc-action': 'service-delete'}, 'service-delete-req-info': {'service-name': 'service1', 'tail-retention': 'no'}}

    def transportpce_api_rpc_request(api_module: str, rpc: str, payload: dict):
        # pylint: disable=consider-using-f-string
        url = "{}/operations/{}:{}".format('{}', api_module, rpc)
        if payload is None:
            data = None
        elif RESTCONF_VERSION == 'draft-bierman02':
            data = prepend_dict_keys({'input': payload}, api_module + ':')
        else:
            data = {'input': payload}
        response = post_request(url, data)
        if response.status_code == requests.codes.no_content:
            return_output = None
        else:
            res = response.json()
            return_key = {'rfc8040': api_module + ':output',
                          'draft-bierman02': 'output'}
            if response.status_code == requests.codes.internal_server_error:
                return_output = res
            else:
>               return_output = res[return_key[RESTCONF_VERSION]]
E               KeyError: 'org-openroadm-service:output'

transportpce_tests/common/test_utils.py:694: KeyError
----------------------------- Captured stdout call -----------------------------
execution of test_46_delete_eth_service1
________ TransportPCEFulltesting.test_47_disconnect_xponders_from_roadm ________

self =

    def test_47_disconnect_xponders_from_roadm(self):
        response = test_utils.get_ietf_network_request('openroadm-topology', 'config')
>       self.assertEqual(response['status_code'], requests.codes.ok)
E       AssertionError: 401 != 200

transportpce_tests/hybrid/test01_device_change_notifications.py:632: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_47_disconnect_xponders_from_roadm
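test_46 shows the same gap on the RPC side: transportpce_api_rpc_request() special-cases only 204 and 500, so a 401 reply, whose body has no 'org-openroadm-service:output' member, surfaces as a KeyError rather than as an authentication failure. A hedged variant is sketched below; it is not the project's helper, and it assumes the post_request(), prepend_dict_keys() and RESTCONF_VERSION symbols visible in the traceback above:

    import requests

    from transportpce_tests.common.test_utils import (RESTCONF_VERSION, post_request,
                                                      prepend_dict_keys)

    def transportpce_api_rpc_request_checked(api_module: str, rpc: str, payload: dict) -> dict:
        url = '{}/operations/{}:{}'.format('{}', api_module, rpc)
        if payload is None:
            data = None
        elif RESTCONF_VERSION == 'draft-bierman02':
            data = prepend_dict_keys({'input': payload}, api_module + ':')
        else:
            data = {'input': payload}
        response = post_request(url, data)
        if response.status_code == requests.codes.no_content:
            output = None
        elif not response.ok:
            # 401/403/500...: report the raw body (if any) with the status code
            # instead of assuming the RPC produced an output container.
            try:
                output = response.json()
            except ValueError:
                output = None
        else:
            return_key = {'rfc8040': api_module + ':output', 'draft-bierman02': 'output'}
            output = response.json().get(return_key[RESTCONF_VERSION])
        return {'status_code': response.status_code, 'output': output}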
_______________ TransportPCEFulltesting.test_48_disconnect_XPDRA _______________

self =

    def test_48_disconnect_XPDRA(self):
        response = test_utils.unmount_device("XPDRA01")
>       self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content))
E       AssertionError: 401 not found in (200, 204)

transportpce_tests/hybrid/test01_device_change_notifications.py:642: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_48_disconnect_XPDRA
Searching for pattern 'onDeviceDisConnected:\ XPDRA01' in karaf.log...
Pattern not found after 180 seconds!
Node XPDRA01 still not deleted from tpce topology...
_______________ TransportPCEFulltesting.test_49_disconnect_XPDRC _______________

self =

    def test_49_disconnect_XPDRC(self):
        response = test_utils.unmount_device("XPDR-C1")
>       self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content))
E       AssertionError: 401 not found in (200, 204)

transportpce_tests/hybrid/test01_device_change_notifications.py:646: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_49_disconnect_XPDRC
Searching for pattern 'onDeviceDisConnected:\ XPDR\-C1' in karaf.log...
Pattern not found after 180 seconds!
Node XPDR-C1 still not deleted from tpce topology...
______________ TransportPCEFulltesting.test_50_disconnect_ROADMA _______________

self =

    def test_50_disconnect_ROADMA(self):
        response = test_utils.unmount_device("ROADM-A1")
>       self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content))
E       AssertionError: 401 not found in (200, 204)

transportpce_tests/hybrid/test01_device_change_notifications.py:650: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_50_disconnect_ROADMA
Searching for pattern 'onDeviceDisConnected:\ ROADM\-A1' in karaf.log...
Pattern not found after 180 seconds!
Node ROADM-A1 still not deleted from tpce topology...
______________ TransportPCEFulltesting.test_51_disconnect_ROADMC _______________

self =

    def test_51_disconnect_ROADMC(self):
        response = test_utils.unmount_device("ROADM-C1")
>       self.assertIn(response.status_code, (requests.codes.ok, requests.codes.no_content))
E       AssertionError: 401 not found in (200, 204)

transportpce_tests/hybrid/test01_device_change_notifications.py:654: AssertionError
----------------------------- Captured stdout call -----------------------------
execution of test_51_disconnect_ROADMC
Searching for pattern 'onDeviceDisConnected:\ ROADM\-C1' in karaf.log...
Pattern not found after 180 seconds!
Node ROADM-C1 still not deleted from tpce topology...
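The "Pattern not found after 180 seconds" messages come from the test utilities polling karaf.log for a disconnect notification that never arrives, because the unmount requests themselves were rejected with 401. As an illustration of that waiting pattern only, here is a hedged sketch; the function name, log path and 180-second default are placeholders, not the project's actual helper:

    import re
    import time

    # Hypothetical helper for illustration; the real test_utils log search and
    # the karaf.log location are not shown in this output.
    def wait_for_log_pattern(pattern: str, log_path: str = 'karaf.log', timeout: int = 180) -> bool:
        regex = re.compile(pattern)
        deadline = time.time() + timeout
        while time.time() < deadline:
            try:
                with open(log_path, encoding='utf-8', errors='ignore') as log_file:
                    if any(regex.search(line) for line in log_file):
                        return True
            except FileNotFoundError:
                pass  # log not created yet; keep polling
            time.sleep(2)
        return False  # mirrors the "Pattern not found after 180 seconds!" message

    # e.g. wait_for_log_pattern(r'onDeviceDisConnected:\ XPDRA01')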
--------------------------- Captured stdout teardown ---------------------------
all processes killed
=========================== short test summary info ============================
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_01_connect_xpdrA
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_02_connect_xpdrC
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_03_connect_rdmA
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_04_connect_rdmC
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_05_connect_xpdrA_N1_to_roadmA_PP1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_06_connect_roadmA_PP1_to_xpdrA_N1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_07_connect_xpdrC_N1_to_roadmC_PP1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_08_connect_roadmC_PP1_to_xpdrC_N1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_09_add_omsAttributes_ROADMA_ROADMC
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_10_add_omsAttributes_ROADMC_ROADMA
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_11_create_eth_service1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_12_get_eth_service1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_14_check_update_portmapping
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_15_check_update_openroadm_topo
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_16_check_update_service1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_18_check_update_portmapping_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_19_check_update_openroadm_topo_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_20_check_update_service1_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_22_check_update_portmapping
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_23_check_update_openroadm_topo
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_25_check_update_portmapping_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_26_check_update_openroadm_topo_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_27_check_update_service1_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_29_check_update_portmapping
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_30_check_update_openroadm_topo
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_32_check_update_portmapping_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_33_check_update_openroadm_topo_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_34_check_update_service1_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_36_check_update_portmapping
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_37_check_update_openroadm_topo
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_39_check_update_portmapping_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_40_check_update_openroadm_topo_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_41_check_update_service1_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_43_check_update_portmapping
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_44_check_update_openroadm_topo
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_45_check_update_service1_ok
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_46_delete_eth_service1
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_47_disconnect_xponders_from_roadm
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_48_disconnect_XPDRA
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_49_disconnect_XPDRC
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_50_disconnect_ROADMA
FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TransportPCEFulltesting::test_51_disconnect_ROADMC
42 failed, 9 passed in 1504.72s (0:25:04)
tests_hybrid: exit 1 (1505.00 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid pid=56425
tests_hybrid: FAIL ✖ in 25 minutes 16.77 seconds
buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
buildlighty: freeze> python -m pip freeze --all
buildlighty: bcrypt==4.2.0,certifi==2024.8.30,cffi==1.17.1,charset-normalizer==3.3.2,cryptography==43.0.1,dict2xml==1.7.6,idna==3.8,iniconfig==2.0.0,lxml==5.3.0,netconf-client==3.1.1,packaging==24.1,paramiko==3.4.1,pip==24.2,pluggy==1.5.0,psutil==6.0.0,pycparser==2.22,PyNaCl==1.5.0,pytest==8.3.2,requests==2.32.3,setuptools==72.1.0,urllib3==2.2.2,wheel==0.44.0
buildlighty: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh
[ERROR] COMPILATION ERROR :
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[134,114] incompatible types: org.opendaylight.mdsal.binding.dom.adapter.CurrentAdapterSerializer cannot be converted to org.opendaylight.yangtools.binding.data.codec.spi.BindingDOMCodecServices
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[248,73] incompatible types: org.opendaylight.mdsal.binding.dom.adapter.CurrentAdapterSerializer cannot be converted to org.opendaylight.yangtools.binding.data.codec.spi.BindingDOMCodecServices
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,75] incompatible types: inference variable E has incompatible bounds
    equality constraints: org.opendaylight.yangtools.yang.binding.YangModuleInfo
    lower bounds: org.opendaylight.yangtools.binding.meta.YangModuleInfo,org.opendaylight.yangtools.yang.binding.YangModuleInfo
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure:
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[134,114] incompatible types: org.opendaylight.mdsal.binding.dom.adapter.CurrentAdapterSerializer cannot be converted to org.opendaylight.yangtools.binding.data.codec.spi.BindingDOMCodecServices
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[248,73] incompatible types: org.opendaylight.mdsal.binding.dom.adapter.CurrentAdapterSerializer cannot be converted to org.opendaylight.yangtools.binding.data.codec.spi.BindingDOMCodecServices
[ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/utils/TPCEUtils.java:[21,75] incompatible types: inference variable E has incompatible bounds
[ERROR] equality constraints: org.opendaylight.yangtools.yang.binding.YangModuleInfo
[ERROR] lower bounds: org.opendaylight.yangtools.binding.meta.YangModuleInfo,org.opendaylight.yangtools.yang.binding.YangModuleInfo
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
buildlighty: exit 9 (13.88 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh pid=57405
buildcontroller: OK (125.31=setup[8.39]+cmd[116.92] seconds)
testsPCE: FAIL code 1 (193.87=setup[121.00]+cmd[72.87] seconds)
sims: OK (10.83=setup[6.98]+cmd[3.85] seconds)
build_karaf_tests121: OK (61.85=setup[7.00]+cmd[54.85] seconds)
tests121: OK (2142.25=setup[7.61]+cmd[2134.64] seconds)
build_karaf_tests221: OK (65.43=setup[6.95]+cmd[58.48] seconds)
tests_tapi: OK (783.21=setup[15.83]+cmd[767.38] seconds)
tests221: FAIL code 1 (2982.62=setup[6.20]+cmd[2976.42] seconds)
build_karaf_tests71: OK (96.38=setup[19.07]+cmd[77.31] seconds)
tests71: OK (406.11=setup[6.62]+cmd[399.49] seconds)
build_karaf_tests_hybrid: OK (64.70=setup[18.01]+cmd[46.69] seconds)
tests_hybrid: FAIL code 1 (1516.77=setup[11.77]+cmd[1505.00] seconds)
buildlighty: FAIL code 9 (21.65=setup[7.77]+cmd[13.88] seconds)
docs: OK (41.97=setup[38.48]+cmd[3.49] seconds)
docs-linkcheck: OK (43.79=setup[39.31]+cmd[4.48] seconds)
checkbashisms: OK (2.59=setup[1.65]+cmd[0.02,0.06,0.86] seconds)
pre-commit: OK (52.51=setup[3.62]+cmd[0.01,0.00,38.29,10.60] seconds)
pylint: OK (25.77=setup[5.86]+cmd[19.91] seconds)
evaluation failed :( (5901.19 seconds)