Started by upstream project "autorelease-release-chromium-mvn39-openjdk21" build number 91
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-queue-disttest-2c-1g-6407 (queue-disttest-2c-1g) in workspace /w/workspace/integration-distribution-test-vanadium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-py49rTC9dsQX/agent.3116
SSH_AGENT_PID=3118
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/integration-distribution-test-vanadium@tmp/private_key_12805481128642372591.key (/w/workspace/integration-distribution-test-vanadium@tmp/private_key_12805481128642372591.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
No emails were triggered.
provisioning config files...
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins3584565249664231908.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.20 (set by /opt/pyenv/version)
* 3.9.20 (set by /opt/pyenv/version)
  3.10.15
  3.11.10
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-HTKX
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-HTKX/bin to PATH
Generating Requirements File
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
Python 3.11.10
pip 26.0.1 from /tmp/venv-HTKX/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.6.3
aspy.yaml==1.3.0
attrs==26.1.0
autopage==0.6.0
beautifulsoup4==4.14.3
boto3==1.42.93
botocore==1.42.93
bs4==0.0.2
certifi==2026.2.25
cffi==2.0.0
cfgv==3.5.0
chardet==7.4.3
charset-normalizer==3.4.7
click==8.3.2
cliff==4.13.3
cmd2==3.5.0
cryptography==3.3.2
debtcollector==3.1.0
decorator==5.2.1
defusedxml==0.7.1
Deprecated==1.3.1
distlib==0.4.0
dnspython==2.8.0
docker==7.1.0
dogpile.cache==1.5.0
durationpy==0.10
email-validator==2.3.0
filelock==3.29.0
future==1.0.0
gitdb==4.0.12
GitPython==3.1.46
httplib2==0.30.2
identify==2.6.19
idna==3.12
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.6
jmespath==1.1.0
jsonpatch==1.33
jsonpointer==3.1.1
jsonschema==4.26.0
jsonschema-specifications==2025.9.1
keystoneauth1==5.13.1
kubernetes==35.0.0
lftools==0.37.22
lxml==6.1.0
markdown-it-py==4.0.0
MarkupSafe==3.0.3
mdurl==0.1.2
msgpack==1.1.2
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
niet==1.4.2
nodeenv==1.10.0
oauth2client==4.1.3
oauthlib==3.3.1
openstacksdk==4.11.0
os-service-types==1.8.2
osc-lib==4.5.0
oslo.config==10.3.0
oslo.context==6.3.0
oslo.i18n==6.7.2
oslo.log==8.1.0
oslo.serialization==5.9.1
oslo.utils==10.0.1
packaging==26.1
pbr==7.0.3
platformdirs==4.9.6
prettytable==3.17.0
psutil==7.2.2
pyasn1==0.6.3
pyasn1_modules==0.4.2
pycparser==3.0
pygerrit2==2.0.15
PyGithub==2.9.1
Pygments==2.20.0
PyJWT==2.12.1
PyNaCl==1.6.2
pyparsing==2.4.7
pyperclip==1.11.0
pyrsistent==0.20.0
python-cinderclient==9.9.0
python-dateutil==2.9.0.post0
python-discovery==1.2.2
python-heatclient==5.1.0
python-jenkins==1.8.3
python-keystoneclient==5.8.0
python-magnumclient==4.10.0
python-openstackclient==9.0.0
python-swiftclient==4.10.0
PyYAML==6.0.3
referencing==0.37.0
requests==2.33.1
requests-oauthlib==2.0.0
rfc3986==2.0.0
rich==15.0.0
rich-argparse==1.7.2
rpds-py==0.30.0
rsa==4.9.1
ruamel.yaml==0.19.1
ruamel.yaml.clib==0.2.15
s3transfer==0.16.0
simplejson==4.0.1
six==1.17.0
smmap==5.0.3
soupsieve==2.8.3
stevedore==5.7.0
tabulate==0.10.0
toml==0.10.2
tomlkit==0.14.0
tqdm==4.67.3
typing_extensions==4.15.0
urllib3==1.26.20
virtualenv==21.2.4
wcwidth==0.6.0
websocket-client==1.9.0
wrapt==2.1.2
xdg==6.0.0
xmltodict==1.0.4
yq==3.4.3
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins806570977911099959.sh
---> uv-install.sh
Installing uv/uvx (latest) using shell installer
2026-04-22 02:00:06 URL:https://releases.astral.sh/installers/uv/latest/uv-installer.sh [71225/71225] -> "/tmp/uv-install-70QMyC.sh" [1]
downloading uv 0.11.7 x86_64-unknown-linux-gnu
installing to /home/jenkins/.local/bin
  uv
  uvx
everything's installed!

To add $HOME/.local/bin to your PATH, either restart your shell or run:

    source $HOME/.local/bin/env (sh, bash, zsh)
    source $HOME/.local/bin/env.fish (fish)

Adding install location to PATH
---> Validating uv/uvx install
uvx 0.11.7 (x86_64-unknown-linux-gnu)
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-all-vanadium
bgpcep-csit-1node-bgp-ingest-all-vanadium #198 started.
bgpcep-csit-1node-bgp-ingest-all-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #198 started.
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-throughpcep-all-vanadium
bgpcep-csit-1node-throughpcep-all-vanadium #198 started.
bgpcep-csit-1node-throughpcep-all-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-userfeatures-all-vanadium
bgpcep-csit-1node-userfeatures-all-vanadium #198 started.
bgpcep-csit-1node-userfeatures-all-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of daexim-csit-1node-basic-only-vanadium
daexim-csit-1node-basic-only-vanadium #198 started.
daexim-csit-1node-basic-only-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of daexim-csit-3node-clustering-basic-only-vanadium
daexim-csit-3node-clustering-basic-only-vanadium #198 started.
daexim-csit-3node-clustering-basic-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of distribution-csit-managed-vanadium
distribution-csit-managed-vanadium #198 started.
distribution-csit-managed-vanadium #198 completed.
Result was SUCCESS
Waiting for the completion of jsonrpc-csit-1node-basic-only-vanadium
jsonrpc-csit-1node-basic-only-vanadium #199 started.
jsonrpc-csit-1node-basic-only-vanadium #199 completed.
Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-cbench-only-vanadium
openflowplugin-csit-1node-cbench-only-vanadium #198 started.
openflowplugin-csit-1node-cbench-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of openflowplugin-csit-1node-flow-services-all-vanadium
openflowplugin-csit-1node-flow-services-all-vanadium #198 started.
openflowplugin-csit-1node-flow-services-all-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-bulkomatic-only-vanadium
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #198 started.
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-stats-collection-only-vanadium
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #198 started.
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-link-only-vanadium
openflowplugin-csit-1node-scale-link-only-vanadium #198 started.
openflowplugin-csit-1node-scale-link-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of openflowplugin-csit-1node-scale-only-vanadium
openflowplugin-csit-1node-scale-only-vanadium #198 started.
openflowplugin-csit-1node-scale-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of openflowplugin-csit-1node-scale-switch-only-vanadium
openflowplugin-csit-1node-scale-switch-only-vanadium #198 started.
openflowplugin-csit-1node-scale-switch-only-vanadium #198 completed.
Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #198 started.
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of openflowplugin-csit-3node-clustering-only-vanadium
openflowplugin-csit-3node-clustering-only-vanadium #198 started.
openflowplugin-csit-3node-clustering-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #198 started.
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #198 completed.
Result was FAILURE
Waiting for the completion of ovsdb-csit-1node-upstream-southbound-all-vanadium
ovsdb-csit-1node-upstream-southbound-all-vanadium #199 started.
ovsdb-csit-1node-upstream-southbound-all-vanadium #199 completed.
Result was UNSTABLE
Waiting for the completion of ovsdb-csit-3node-upstream-clustering-only-vanadium
ovsdb-csit-3node-upstream-clustering-only-vanadium #198 started.
ovsdb-csit-3node-upstream-clustering-only-vanadium #198 completed.
Result was FAILURE
Build step 'Trigger/call builds on other projects' changed build result to UNSTABLE
Build step 'Trigger/call builds on other projects' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 3118 killed;
[ssh-agent] Stopped.
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins17727429960158130300.sh
python /tmp/tmp.tqFOV6C4sU https://jenkins.opendaylight.org/releng/job/integration-distribution-test-vanadium/200/
/tmp/tmp.tqFOV6C4sU:26: DeprecationWarning: Call to deprecated method findAll. (Replaced by find_all) -- Deprecated since version 4.0.0.
  links = soup.findAll("a", { "class" : "model-link" })
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins11112308288478077278.sh
---> sysstat.sh
/tmp/jenkins11112308288478077278.sh: line 19: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins11433656168607896833.sh
---> package-listing.sh
++ tr '[:upper:]' '[:lower:]'
++ facter osfamily
/tmp/jenkins11433656168607896833.sh: line 19: facter: command not found
+ OS_FAMILY=
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins12888902368281416946.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-HTKX from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-HTKX/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins14282268957309424666.sh
provisioning config files...
Could not find credentials [logs] for integration-distribution-test-vanadium #200
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/integration-distribution-test-vanadium@tmp/config17843932322322011258tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs

[EnvInject] - Variables injected successfully.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins179724321034939800.sh
---> create-netrc.sh
WARN: Log server credential not found.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins9248046129262481375.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-HTKX from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-HTKX/bin to PATH
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins6611930416951858977.sh
---> uv-install.sh
uv 0.11.7 is already installed
uvx 0.11.7 (x86_64-unknown-linux-gnu)
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins14629316214968965672.sh
---> sudo-logs.sh
Archiving 'sudo' log..
/tmp/jenkins14629316214968965672.sh: line 41: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins13555647918740089369.sh
---> job-cost.sh
INFO: Activating Python virtual environment...
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-HTKX from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-HTKX/bin to PATH
INFO: No stack-cost file found
INFO: Instance uptime: 4235s
INFO: Fetching instance metadata (attempt 1 of 3)...
DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
INFO: Successfully fetched instance metadata
INFO: Instance type: v3-standard-2
INFO: Retrieving pricing info for: v3-standard-2
INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=4235
INFO: Successfully fetched Vexxhost pricing API
INFO: Retrieved cost: 0.11
INFO: Retrieved resource: v3-standard-2
INFO: Creating archive directory: /w/workspace/integration-distribution-test-vanadium/archives/cost
INFO: Archiving costs to: /w/workspace/integration-distribution-test-vanadium/archives/cost.csv
INFO: Successfully archived job cost data
DEBUG: Cost data: integration-distribution-test-vanadium,200,2026-04-22 03:09:17,v3-standard-2,4235,0.11,0.00,FAILURE
[integration-distribution-test-vanadium] $ /bin/bash -l /tmp/jenkins12668562777298762467.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-HTKX from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-HTKX/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/integration-distribution-test-vanadium/200/
INFO: archiving logs to S3
---> uname -a:
Linux prd-queue-disttest-2c-1g-6407 6.8.0-90-generic #91-Ubuntu SMP PREEMPT_DYNAMIC Tue Nov 18 14:14:30 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:                         x86_64
CPU op-mode(s):                       32-bit, 64-bit
Address sizes:                        40 bits physical, 48 bits virtual
Byte Order:                           Little Endian
CPU(s):                               2
On-line CPU(s) list:                  0,1
Vendor ID:                            AuthenticAMD
Model name:                           AMD EPYC-Rome Processor
CPU family:                           23
Model:                                49
Thread(s) per core:                   1
Core(s) per socket:                   1
Socket(s):                            2
Stepping:                             0
BogoMIPS:                             5599.99
Flags:                                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
Virtualization:                       AMD-V
Hypervisor vendor:                    KVM
Virtualization type:                  full
L1d cache:                            64 KiB (2 instances)
L1i cache:                            64 KiB (2 instances)
L2 cache:                             1 MiB (2 instances)
L3 cache:                             32 MiB (2 instances)
NUMA node(s):                         1
NUMA node0 CPU(s):                    0,1
Vulnerability Gather data sampling:   Not affected
Vulnerability Itlb multihit:          Not affected
Vulnerability L1tf:                   Not affected
Vulnerability Mds:                    Not affected
Vulnerability Meltdown:               Not affected
Vulnerability Mmio stale data:        Not affected
Vulnerability Reg file data sampling: Not affected
Vulnerability Retbleed:               Mitigation; untrained return thunk; SMT disabled
Vulnerability Spec rstack overflow:   Vulnerable: Safe RET, no microcode
Vulnerability Spec store bypass:      Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:             Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:             Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                  Not affected
Vulnerability Tsx async abort:        Not affected
Vulnerability Vmscape:                Not affected

---> nproc:
2

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           795M  1.1M  794M   1% /run
/dev/vda1        38G  7.8G   30G  21% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/vda16      881M  117M  703M  15% /boot
/dev/vda15      105M  6.2M   99M   6% /boot/efi
tmpfs           795M   16K  795M   1% /run/user/1001

---> free -m:
               total        used        free      shared  buff/cache   available
Mem:            7941         702        5511           4        2011        7239
Swap:           1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:15:04:1a brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    inet 10.30.170.159/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 82160sec preferred_lft 82160sec
    inet6 fe80::f816:3eff:fe15:41a/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-6407)  04/22/26  _x86_64_  (2 CPU)

01:58:55     LINUX RESTART      (2 CPU)

02:00:06          tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
02:10:20         2.83      0.03      2.74      0.06      1.37    670.83      3.65
02:20:01         2.03      0.34      1.68      0.01     40.38     76.16      0.33
02:30:20         1.55      0.00      1.54      0.01      0.00     17.74      0.22
02:40:20         1.40      0.00      1.38      0.02      0.00     15.97      1.20
02:50:00         1.32      0.00      1.31      0.02      0.00     15.52      1.24
03:00:20         1.59      0.00      1.55      0.03      0.00     18.50      1.38
Average:         1.79      0.06      1.70      0.03      6.72    137.57      1.35

02:00:06    kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
02:10:20      5771432   7489212    301436      3.71     61508   1836764    537996      5.86    463836   1615436       132
02:20:01      5743192   7489824    300252      3.69     61824   1865028    740232      8.06    480960   1632952       184
02:30:20      5736588   7483764    306112      3.76     61992   1865356    740232      8.06    480908   1633448       168
02:40:20      5759044   7506736    283172      3.48     62180   1865676    740376      8.06    480952   1633956       168
02:50:00      5758036   7506232    283668      3.49     62344   1866012    740376      8.06    481076   1634456       168
03:00:20      5756116   7504948    284936      3.50     62512   1866464    740376      8.06    481152   1634948       168
Average:      5754068   7496786    293263      3.61     62060   1860883    706598      7.70    478147   1630866       165

02:00:06        IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
02:10:20           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:10:20         ens3      5.17      0.34      0.90      0.26      0.00      0.00      0.00      0.00
02:20:01           lo      1.02      1.02      0.05      0.05      0.00      0.00      0.00      0.00
02:20:01         ens3      1.19      0.25      3.24      0.03      0.00      0.00      0.00      0.00
02:30:20           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:30:20         ens3      0.33      0.02      0.05      0.01      0.00      0.00      0.00      0.00
02:40:20           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:40:20         ens3      0.40      0.05      0.08      0.03      0.00      0.00      0.00      0.00
02:50:00           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:50:00         ens3      0.21      0.04      0.04      0.02      0.00      0.00      0.00      0.00
03:00:20           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:00:20         ens3      0.21      0.06      0.06      0.03      0.00      0.00      0.00      0.00
Average:           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
Average:         ens3      1.27      0.13      0.71      0.06      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-6407)  04/22/26  _x86_64_  (2 CPU)

01:58:55     LINUX RESTART      (2 CPU)

02:00:06        CPU     %user     %nice   %system   %iowait    %steal     %idle
02:10:20        all      0.33      0.00      0.14      0.06      0.02     99.44
02:10:20          0      0.22      0.00      0.16      0.09      0.02     99.51
02:10:20          1      0.45      0.00      0.13      0.03      0.02     99.38
02:20:01        all      0.36      0.00      0.13      0.03      0.02     99.47
02:20:01          0      0.22      0.00      0.12      0.02      0.02     99.62
02:20:01          1      0.49      0.00      0.14      0.04      0.02     99.31
02:30:20        all      0.29      0.00      0.16      0.03      0.02     99.51
02:30:20          0      0.26      0.00      0.15      0.01      0.01     99.57
02:30:20          1      0.31      0.00      0.17      0.04      0.02     99.46
02:40:20        all      0.26      0.00      0.07      0.04      0.01     99.61
02:40:20          0      0.14      0.00      0.08      0.03      0.02     99.73
02:40:20          1      0.39      0.00      0.05      0.05      0.01     99.49
02:50:00        all      0.25      0.00      0.07      0.04      0.01     99.63
02:50:00          0      0.05      0.00      0.09      0.04      0.02     99.80
02:50:00          1      0.44      0.00      0.05      0.04      0.01     99.46
03:00:20        all      0.29      0.00      0.16      0.01      0.02     99.52
03:00:20          0      0.08      0.00      0.18      0.02      0.02     99.70
03:00:20          1      0.50      0.00      0.15      0.00      0.01     99.34
Average:        all      0.30      0.00      0.12      0.03      0.02     99.53
Average:          0      0.16      0.00      0.13      0.04      0.02     99.65
Average:          1      0.43      0.00      0.12      0.03      0.01     99.41