Started by upstream project "autorelease-release-titanium-mvn39-openjdk21" build number 609
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-queue-disttest-2c-1g-8177 (queue-disttest-2c-1g) in workspace /w/workspace/integration-distribution-test-titanium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-yv58O93KNv1g/agent.3143
SSH_AGENT_PID=3145
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/integration-distribution-test-titanium@tmp/private_key_7856106965034763047.key (/w/workspace/integration-distribution-test-titanium@tmp/private_key_7856106965034763047.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
No emails were triggered.
provisioning config files...
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins5689188285947299902.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.20 (set by /opt/pyenv/version)
* 3.9.20 (set by /opt/pyenv/version)
  3.10.15
  3.11.10
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-fQss
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-fQss/bin to PATH
Generating Requirements File
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
Python 3.11.10
pip 26.0.1 from /tmp/venv-fQss/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.6.3
aspy.yaml==1.3.0
attrs==26.1.0
autopage==0.6.0
beautifulsoup4==4.14.3
boto3==1.42.96
botocore==1.42.96
bs4==0.0.2
certifi==2026.4.22
cffi==2.0.0
cfgv==3.5.0
chardet==7.4.3
charset-normalizer==3.4.7
click==8.3.3
cliff==4.13.3
cmd2==3.5.1
cryptography==3.3.2
debtcollector==3.1.0
decorator==5.2.1
defusedxml==0.7.1
Deprecated==1.3.1
distlib==0.4.0
dnspython==2.8.0
docker==7.1.0
dogpile.cache==1.5.0
durationpy==0.10
email-validator==2.3.0
filelock==3.29.0
future==1.0.0
gitdb==4.0.12
GitPython==3.1.47
httplib2==0.30.2
identify==2.6.19
idna==3.13
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.6
jmespath==1.1.0
jsonpatch==1.33
jsonpointer==3.1.1
jsonschema==4.26.0
jsonschema-specifications==2025.9.1
keystoneauth1==5.13.1
kubernetes==35.0.0
lftools==0.37.22
lxml==6.1.0
markdown-it-py==4.0.0
MarkupSafe==3.0.3
mdurl==0.1.2
msgpack==1.1.2
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
niet==1.4.2
nodeenv==1.10.0
oauth2client==4.1.3
oauthlib==3.3.1
openstacksdk==4.11.0
os-service-types==1.8.2
osc-lib==4.5.0
oslo.config==10.3.0
oslo.context==6.3.0
oslo.i18n==6.7.2
oslo.log==8.1.0
oslo.serialization==5.9.1
oslo.utils==10.0.1
packaging==26.2
pbr==7.0.3
platformdirs==4.9.6
prettytable==3.17.0
psutil==7.2.2
pyasn1==0.6.3
pyasn1_modules==0.4.2
pycparser==3.0
pygerrit2==2.0.15
PyGithub==2.9.1
Pygments==2.20.0
PyJWT==2.12.1
PyNaCl==1.6.2
pyparsing==2.4.7
pyperclip==1.11.0
pyrsistent==0.20.0
python-cinderclient==9.9.0
python-dateutil==2.9.0.post0
python-discovery==1.2.2
python-heatclient==5.1.0
python-jenkins==1.8.3
python-keystoneclient==5.8.0
python-magnumclient==4.10.0
python-openstackclient==9.0.0
python-swiftclient==4.10.0
PyYAML==6.0.3
referencing==0.37.0
requests==2.33.1
requests-oauthlib==2.0.0
rfc3986==2.0.0
rich==15.0.0
rich-argparse==1.7.2
rpds-py==0.30.0
rsa==4.9.1
ruamel.yaml==0.19.1
ruamel.yaml.clib==0.2.15
s3transfer==0.16.1
simplejson==4.1.1
six==1.17.0
smmap==5.0.3
soupsieve==2.8.3
stevedore==5.7.0
tabulate==0.10.0
toml==0.10.2
tomlkit==0.14.0
tqdm==4.67.3
typing_extensions==4.15.0
urllib3==1.26.20
virtualenv==21.2.4
wcwidth==0.6.0
websocket-client==1.9.0
wrapt==2.1.2
xdg==6.0.0
xmltodict==1.0.4
yq==3.4.3
[integration-distribution-test-titanium] $ /bin/sh /tmp/jenkins6503852531116424340.sh
---> uv-install.sh
Installing uv/uvx (latest) using shell installer
2026-04-26 02:49:57 URL:https://releases.astral.sh/installers/uv/latest/uv-installer.sh [71225/71225] -> "/tmp/uv-install-6oh3RU.sh" [1]
downloading uv 0.11.7 x86_64-unknown-linux-gnu
installing to /home/jenkins/.local/bin
  uv
  uvx
everything's installed!

To add $HOME/.local/bin to your PATH, either restart your shell or run:

    source $HOME/.local/bin/env (sh, bash, zsh)
    source $HOME/.local/bin/env.fish (fish)
Adding install location to PATH
---> Validating uv/uvx install
uvx 0.11.7 (x86_64-unknown-linux-gnu)
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-all-titanium
bgpcep-csit-1node-bgp-ingest-all-titanium #700 started.
bgpcep-csit-1node-bgp-ingest-all-titanium #700 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-mixed-all-titanium
bgpcep-csit-1node-bgp-ingest-mixed-all-titanium #701 started.
bgpcep-csit-1node-bgp-ingest-mixed-all-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-throughpcep-all-titanium
bgpcep-csit-1node-throughpcep-all-titanium #701 started.
bgpcep-csit-1node-throughpcep-all-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-userfeatures-all-titanium
bgpcep-csit-1node-userfeatures-all-titanium #701 started.
bgpcep-csit-1node-userfeatures-all-titanium #701 completed.
Result was UNSTABLE
Waiting for the completion of daexim-csit-1node-basic-only-titanium
daexim-csit-1node-basic-only-titanium #701 started.
daexim-csit-1node-basic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of daexim-csit-3node-clustering-basic-only-titanium
daexim-csit-3node-clustering-basic-only-titanium #701 started.
daexim-csit-3node-clustering-basic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of distribution-csit-managed-titanium
distribution-csit-managed-titanium #675 started.
distribution-csit-managed-titanium #675 completed. Result was SUCCESS
Waiting for the completion of jsonrpc-csit-1node-basic-only-titanium
jsonrpc-csit-1node-basic-only-titanium #701 started.
jsonrpc-csit-1node-basic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-cbench-only-titanium
openflowplugin-csit-1node-cbench-only-titanium #701 started.
openflowplugin-csit-1node-cbench-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-flow-services-all-titanium
openflowplugin-csit-1node-flow-services-all-titanium #701 started.
openflowplugin-csit-1node-flow-services-all-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-bulkomatic-only-titanium
openflowplugin-csit-1node-perf-bulkomatic-only-titanium #701 started.
openflowplugin-csit-1node-perf-bulkomatic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-stats-collection-only-titanium
openflowplugin-csit-1node-perf-stats-collection-only-titanium #702 started.
openflowplugin-csit-1node-perf-stats-collection-only-titanium #702 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-link-only-titanium
openflowplugin-csit-1node-scale-link-only-titanium #701 started.
openflowplugin-csit-1node-scale-link-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-only-titanium
openflowplugin-csit-1node-scale-only-titanium #701 started.
openflowplugin-csit-1node-scale-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-switch-only-titanium
openflowplugin-csit-1node-scale-switch-only-titanium #701 started.
openflowplugin-csit-1node-scale-switch-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-bulkomatic-only-titanium
openflowplugin-csit-3node-clustering-bulkomatic-only-titanium #701 started.
openflowplugin-csit-3node-clustering-bulkomatic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-only-titanium
openflowplugin-csit-3node-clustering-only-titanium #701 started.
openflowplugin-csit-3node-clustering-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-perf-bulkomatic-only-titanium
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-titanium #701 started.
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-1node-upstream-southbound-all-titanium
ovsdb-csit-1node-upstream-southbound-all-titanium #701 started.
ovsdb-csit-1node-upstream-southbound-all-titanium #701 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-3node-upstream-clustering-only-titanium
ovsdb-csit-3node-upstream-clustering-only-titanium #701 started.
ovsdb-csit-3node-upstream-clustering-only-titanium #701 completed. Result was UNSTABLE
Build step 'Trigger/call builds on other projects' changed build result to UNSTABLE
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 3145 killed;
[ssh-agent] Stopped.
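Annotation: the downstream CSIT results above roll up into this parent build's status; one UNSTABLE trigger is enough to mark the whole run UNSTABLE (only distribution-csit-managed-titanium came back SUCCESS). A minimal sketch of that worst-result aggregation, assuming a Jenkins-style severity ordering; the helper and names are illustrative, not taken from Jenkins internals:

```python
# Severity ordering assumed to mirror Jenkins: the parent build takes the
# worst result reported by any downstream job it waited on.
SEVERITY = {"SUCCESS": 0, "UNSTABLE": 1, "FAILURE": 2, "ABORTED": 3}

def aggregate(results):
    """Return the worst result in `results` (hypothetical helper)."""
    return max(results, key=lambda r: SEVERITY[r])

# Sample shaped like the run above: mostly UNSTABLE, one SUCCESS.
downstream = ["UNSTABLE", "UNSTABLE", "SUCCESS", "UNSTABLE"]
print(aggregate(downstream))  # -> UNSTABLE
```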
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-titanium] $ /bin/sh /tmp/jenkins16156229553334800913.sh
python /tmp/tmp.1FUIr0Oxo4 https://jenkins.opendaylight.org/releng/job/integration-distribution-test-titanium/714/
/tmp/tmp.1FUIr0Oxo4:26: DeprecationWarning: Call to deprecated method findAll. (Replaced by find_all) -- Deprecated since version 4.0.0.
  links = soup.findAll("a", { "class" : "model-link" })
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins9941394679055100493.sh
---> sysstat.sh
/tmp/jenkins9941394679055100493.sh: line 19: facter: command not found
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins13614949619830721609.sh
---> package-listing.sh
++ facter osfamily
/tmp/jenkins13614949619830721609.sh: line 19: facter: command not found
++ tr '[:upper:]' '[:lower:]'
+ OS_FAMILY=
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins11328846144824819040.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-titanium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-fQss from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-fQss/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins10571298596172005649.sh
provisioning config files...
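Annotation: the DeprecationWarning above comes from the post-build helper script (`/tmp/tmp.1FUIr0Oxo4`) calling BeautifulSoup's old `findAll` name; since bs4 4.0 the method is `find_all`. A minimal sketch of the fix, using stand-in HTML rather than the real Jenkins page:

```python
from bs4 import BeautifulSoup

# Stand-in markup; the real script scrapes the Jenkins build page.
html = '<a class="model-link" href="/job/a/">a</a><a href="/other">x</a>'
soup = BeautifulSoup(html, "html.parser")

# find_all replaces the deprecated findAll alias; same arguments, no warning.
links = soup.find_all("a", {"class": "model-link"})
print([a["href"] for a in links])  # -> ['/job/a/']
```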
Could not find credentials [logs] for integration-distribution-test-titanium #714
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/integration-distribution-test-titanium@tmp/config13196137279959522772tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins10678093469766750106.sh
---> create-netrc.sh
WARN: Log server credential not found.
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins13403807678668451520.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-titanium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-fQss from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-fQss/bin to PATH
[integration-distribution-test-titanium] $ /bin/sh /tmp/jenkins10458593246032839064.sh
---> uv-install.sh
uv 0.11.7 is already installed
uvx 0.11.7 (x86_64-unknown-linux-gnu)
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins9740379371754247754.sh
---> sudo-logs.sh
Archiving 'sudo' log..
/tmp/jenkins9740379371754247754.sh: line 41: facter: command not found
[integration-distribution-test-titanium] $ /bin/bash /tmp/jenkins13788779634744886372.sh
---> job-cost.sh
INFO: Activating Python virtual environment...
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-titanium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-fQss from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-fQss/bin to PATH
INFO: No stack-cost file found
INFO: Instance uptime: 3762s
INFO: Fetching instance metadata (attempt 1 of 3)...
DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
INFO: Successfully fetched instance metadata
INFO: Instance type: v3-standard-2
INFO: Retrieving pricing info for: v3-standard-2
INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=3762
INFO: Successfully fetched Vexxhost pricing API
INFO: Retrieved cost: 0.11
INFO: Retrieved resource: v3-standard-2
INFO: Creating archive directory: /w/workspace/integration-distribution-test-titanium/archives/cost
INFO: Archiving costs to: /w/workspace/integration-distribution-test-titanium/archives/cost.csv
INFO: Successfully archived job cost data
DEBUG: Cost data: integration-distribution-test-titanium,714,2026-04-26 03:51:14,v3-standard-2,3762,0.11,0.00,UNSTABLE
[integration-distribution-test-titanium] $ /bin/bash -l /tmp/jenkins14583999079234310160.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-titanium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-fQss from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
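Annotation: job-cost.sh fetches instance metadata and the Vexxhost pricing API with an "attempt N of 3" retry loop, as the INFO lines show. A generic sketch of that pattern, with a stubbed flaky fetch function standing in for the real HTTP calls; all names here are illustrative, not taken from the actual script:

```python
import time

def fetch_with_retries(fetch, attempts=3, delay=0.0):
    """Call `fetch` up to `attempts` times, returning the first success."""
    for attempt in range(1, attempts + 1):
        print(f"INFO: Fetching (attempt {attempt} of {attempts})...")
        try:
            return fetch()
        except OSError:
            if attempt == attempts:
                raise  # out of attempts: propagate the last failure
            time.sleep(delay)  # back off before retrying

# Stub: fails once, then returns data shaped like the pricing response above.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 2:
        raise OSError("transient network error")
    return {"cost": 0.11, "resource": "v3-standard-2"}

result = fetch_with_retries(flaky)
print(result)  # -> {'cost': 0.11, 'resource': 'v3-standard-2'}
```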
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-fQss/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/integration-distribution-test-titanium/714/
INFO: archiving logs to S3
---> uname -a:
Linux prd-queue-disttest-2c-1g-8177 6.8.0-90-generic #91-Ubuntu SMP PREEMPT_DYNAMIC Tue Nov 18 14:14:30 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:                         x86_64
CPU op-mode(s):                       32-bit, 64-bit
Address sizes:                        40 bits physical, 48 bits virtual
Byte Order:                           Little Endian
CPU(s):                               2
On-line CPU(s) list:                  0,1
Vendor ID:                            AuthenticAMD
Model name:                           AMD EPYC-Rome Processor
CPU family:                           23
Model:                                49
Thread(s) per core:                   1
Core(s) per socket:                   1
Socket(s):                            2
Stepping:                             0
BogoMIPS:                             5599.99
Flags:                                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
Virtualization:                       AMD-V
Hypervisor vendor:                    KVM
Virtualization type:                  full
L1d cache:                            64 KiB (2 instances)
L1i cache:                            64 KiB (2 instances)
L2 cache:                             1 MiB (2 instances)
L3 cache:                             32 MiB (2 instances)
NUMA node(s):                         1
NUMA node0 CPU(s):                    0,1
Vulnerability Gather data sampling:   Not affected
Vulnerability Itlb multihit:          Not affected
Vulnerability L1tf:                   Not affected
Vulnerability Mds:                    Not affected
Vulnerability Meltdown:               Not affected
Vulnerability Mmio stale data:        Not affected
Vulnerability Reg file data sampling:
Not affected
Vulnerability Retbleed:               Mitigation; untrained return thunk; SMT disabled
Vulnerability Spec rstack overflow:   Vulnerable: Safe RET, no microcode
Vulnerability Spec store bypass:      Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:             Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:             Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                  Not affected
Vulnerability Tsx async abort:        Not affected
Vulnerability Vmscape:                Not affected

---> nproc:
2

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           795M  1.1M  794M   1% /run
/dev/vda1        38G  7.8G   30G  21% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/vda16      881M  117M  703M  15% /boot
/dev/vda15      105M  6.2M   99M   6% /boot/efi
tmpfs           795M   16K  795M   1% /run/user/1001

---> free -m:
               total        used        free      shared  buff/cache   available
Mem:            7941         715        5493           4        2016        7226
Swap:           1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:03:fd:e3 brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    inet 10.30.170.252/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 82634sec preferred_lft 82634sec
    inet6 fe80::f816:3eff:fe03:fde3/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-8177)  04/26/26  _x86_64_  (2 CPU)

02:48:44     LINUX RESTART  (2 CPU)

02:50:04          tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
03:00:04         3.16      0.02      3.05      0.08      1.24    871.40      3.60
03:10:04         1.95      0.32      1.61      0.02     39.08     74.17      0.32
03:20:04         1.38      0.00      1.38      0.00      0.01     16.39      0.00
03:30:04         1.51      0.00      1.49      0.02      0.00     18.52      0.35
03:40:04
                 1.33      0.00      1.33      0.00      0.00     15.85      0.00
03:50:04         1.32      0.00      1.32      0.00      0.00     16.03      0.00
Average:         1.77      0.06      1.70      0.02      6.72    168.73      0.71

02:50:04    kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
03:00:04      5756932   7479004    312156      3.84     61624   1840688    560476      6.10    473092   1619736       288
03:10:04      5717256   7468176    322464      3.97     61920   1868972    762916      8.31    490856   1637256       260
03:20:04      5734896   7486324    304260      3.74     62092   1869304    763060      8.31    491480   1637756       184
03:30:04      5733032   7485004    305508      3.76     62264   1869628    763152      8.31    493456   1638256       276
03:40:04      5731800   7484220    306288      3.77     62376   1869956    763216      8.31    494744   1638696       244
03:50:04      5733148   7486164    304328      3.74     62512   1870412    763216      8.31    495640   1639160       208
Average:      5734511   7481482    309167      3.80     62131   1864827    729339      7.94    489878   1635143       243

02:50:04        IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
03:00:04           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:00:04         ens3      3.04      0.24      0.59      0.18      0.00      0.00      0.00      0.00
03:10:04           lo      1.02      1.02      0.05      0.05      0.00      0.00      0.00      0.00
03:10:04         ens3      1.35      0.22      3.17      0.03      0.00      0.00      0.00      0.00
03:20:04           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:20:04         ens3      0.44      0.03      0.05      0.01      0.00      0.00      0.00      0.00
03:30:04           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:30:04         ens3      0.42      0.08      0.10      0.05      0.00      0.00      0.00      0.00
03:40:04           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:40:04         ens3      0.35      0.05      0.08      0.03      0.00      0.00      0.00      0.00
03:50:04           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:50:04         ens3      0.21      0.03      0.04      0.01      0.00      0.00      0.00      0.00
Average:           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
Average:         ens3      0.97      0.11      0.67      0.05      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-8177)  04/26/26  _x86_64_  (2 CPU)

02:48:44     LINUX RESTART  (2 CPU)

02:50:04     CPU     %user     %nice   %system   %iowait    %steal     %idle
03:00:04     all      0.34      0.00      0.21      0.13      0.03     99.28
03:00:04       0      0.21      0.00      0.25      0.23      0.04     99.27
03:00:04       1      0.46      0.00      0.18      0.04      0.03     99.29
03:10:04     all      0.36      0.00      0.12      0.06      0.03     99.42
03:10:04       0      0.49
      0.00      0.12      0.11      0.03     99.25
03:10:04       1      0.23      0.00      0.12      0.02      0.04     99.59
03:20:04     all      0.22      0.00      0.07      0.01      0.03     99.68
03:20:04       0      0.11      0.00      0.09      0.01      0.03     99.77
03:20:04       1      0.33      0.00      0.05      0.01      0.03     99.59
03:30:04     all      0.25      0.00      0.16      0.01      0.04     99.55
03:30:04       0      0.21      0.00      0.17      0.01      0.04     99.57
03:30:04       1      0.29      0.00      0.15      0.01      0.04     99.52
03:40:04     all      0.24      0.00      0.06      0.01      0.04     99.66
03:40:04       0      0.04      0.00      0.08      0.01      0.05     99.83
03:40:04       1      0.44      0.00      0.04      0.01      0.03     99.48
03:50:04     all      0.26      0.00      0.06      0.01      0.04     99.64
03:50:04       0      0.04      0.00      0.09      0.00      0.05     99.82
03:50:04       1      0.48      0.00      0.03      0.01      0.03     99.45
Average:     all      0.28      0.00      0.11      0.04      0.03     99.54
Average:       0      0.18      0.00      0.13      0.06      0.04     99.59
Average:       1      0.37      0.00      0.10      0.02      0.03     99.49
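Annotation: the `sar -P ALL` samples above can be cross-checked against the reported `Average:` row to confirm the node was essentially idle during the run. A small sketch, with the all-CPU `%idle` values transcribed from the samples above (column layout assumed from the sar output):

```python
# Each tuple: (timestamp, cpu, %idle) for the "all" rows of sar -P ALL.
samples = [
    ("03:00:04", "all", 99.28),
    ("03:10:04", "all", 99.42),
    ("03:20:04", "all", 99.68),
    ("03:30:04", "all", 99.55),
    ("03:40:04", "all", 99.66),
    ("03:50:04", "all", 99.64),
]
avg_idle = sum(idle for _, _, idle in samples) / len(samples)
print(f"mean %idle across samples: {avg_idle:.2f}")  # -> 99.54, matching sar's Average row
```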