Started by upstream project "autorelease-release-chromium-mvn39-openjdk21" build number 92
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-queue-disttest-2c-1g-6901 (queue-disttest-2c-1g) in workspace /w/workspace/integration-distribution-test-vanadium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-EYoSeYE1QaLe/agent.3124
SSH_AGENT_PID=3126
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/integration-distribution-test-vanadium@tmp/private_key_17859819266301108802.key (/w/workspace/integration-distribution-test-vanadium@tmp/private_key_17859819266301108802.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
No emails were triggered.
provisioning config files...
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins13912851213818497599.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.20 (set by /opt/pyenv/version)
* 3.9.20 (set by /opt/pyenv/version)
  3.10.15
  3.11.10
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-DYL3
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-DYL3/bin to PATH
Generating Requirements File
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
Python 3.11.10
pip 26.0.1 from /tmp/venv-DYL3/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.6.3
aspy.yaml==1.3.0
attrs==26.1.0
autopage==0.6.0
beautifulsoup4==4.14.3
boto3==1.42.94
botocore==1.42.94
bs4==0.0.2
certifi==2026.4.22
cffi==2.0.0
cfgv==3.5.0
chardet==7.4.3
charset-normalizer==3.4.7
click==8.3.3
cliff==4.13.3
cmd2==3.5.0
cryptography==3.3.2
debtcollector==3.1.0
decorator==5.2.1
defusedxml==0.7.1
Deprecated==1.3.1
distlib==0.4.0
dnspython==2.8.0
docker==7.1.0
dogpile.cache==1.5.0
durationpy==0.10
email-validator==2.3.0
filelock==3.29.0
future==1.0.0
gitdb==4.0.12
GitPython==3.1.47
httplib2==0.30.2
identify==2.6.19
idna==3.13
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.6
jmespath==1.1.0
jsonpatch==1.33
jsonpointer==3.1.1
jsonschema==4.26.0
jsonschema-specifications==2025.9.1
keystoneauth1==5.13.1
kubernetes==35.0.0
lftools==0.37.22
lxml==6.1.0
markdown-it-py==4.0.0
MarkupSafe==3.0.3
mdurl==0.1.2
msgpack==1.1.2
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
niet==1.4.2
nodeenv==1.10.0
oauth2client==4.1.3
oauthlib==3.3.1
openstacksdk==4.11.0
os-service-types==1.8.2
osc-lib==4.5.0
oslo.config==10.3.0
oslo.context==6.3.0
oslo.i18n==6.7.2
oslo.log==8.1.0
oslo.serialization==5.9.1
oslo.utils==10.0.1
packaging==26.1
pbr==7.0.3
platformdirs==4.9.6
prettytable==3.17.0
psutil==7.2.2
pyasn1==0.6.3
pyasn1_modules==0.4.2
pycparser==3.0
pygerrit2==2.0.15
PyGithub==2.9.1
Pygments==2.20.0
PyJWT==2.12.1
PyNaCl==1.6.2
pyparsing==2.4.7
pyperclip==1.11.0
pyrsistent==0.20.0
python-cinderclient==9.9.0
python-dateutil==2.9.0.post0
python-discovery==1.2.2
python-heatclient==5.1.0
python-jenkins==1.8.3
python-keystoneclient==5.8.0
python-magnumclient==4.10.0
python-openstackclient==9.0.0
python-swiftclient==4.10.0
PyYAML==6.0.3
referencing==0.37.0
requests==2.33.1
requests-oauthlib==2.0.0
rfc3986==2.0.0
rich==15.0.0
rich-argparse==1.7.2
rpds-py==0.30.0
rsa==4.9.1
ruamel.yaml==0.19.1
ruamel.yaml.clib==0.2.15
s3transfer==0.16.1
simplejson==4.1.0
six==1.17.0
smmap==5.0.3
soupsieve==2.8.3
stevedore==5.7.0
tabulate==0.10.0
toml==0.10.2
tomlkit==0.14.0
tqdm==4.67.3
typing_extensions==4.15.0
urllib3==1.26.20
virtualenv==21.2.4
wcwidth==0.6.0
websocket-client==1.9.0
wrapt==2.1.2
xdg==6.0.0
xmltodict==1.0.4
yq==3.4.3
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins6436962458090598276.sh
---> uv-install.sh
Installing uv/uvx (latest) using shell installer
2026-04-23 02:50:14 URL:https://releases.astral.sh/installers/uv/latest/uv-installer.sh [71225/71225] -> "/tmp/uv-install-OjmFVA.sh" [1]
downloading uv 0.11.7 x86_64-unknown-linux-gnu
installing to /home/jenkins/.local/bin
  uv
  uvx
everything's installed!

To add $HOME/.local/bin to your PATH, either restart your shell or run:

    source $HOME/.local/bin/env (sh, bash, zsh)
    source $HOME/.local/bin/env.fish (fish)

Adding install location to PATH
---> Validating uv/uvx install
uvx 0.11.7 (x86_64-unknown-linux-gnu)
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-all-vanadium
bgpcep-csit-1node-bgp-ingest-all-vanadium #200 started.
bgpcep-csit-1node-bgp-ingest-all-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #200 started.
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-throughpcep-all-vanadium
bgpcep-csit-1node-throughpcep-all-vanadium #200 started.
bgpcep-csit-1node-throughpcep-all-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-userfeatures-all-vanadium
bgpcep-csit-1node-userfeatures-all-vanadium #200 started.
bgpcep-csit-1node-userfeatures-all-vanadium #200 completed.
Result was UNSTABLE
Waiting for the completion of daexim-csit-1node-basic-only-vanadium
daexim-csit-1node-basic-only-vanadium #200 started.
daexim-csit-1node-basic-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of daexim-csit-3node-clustering-basic-only-vanadium
daexim-csit-3node-clustering-basic-only-vanadium #200 started.
daexim-csit-3node-clustering-basic-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of distribution-csit-managed-vanadium
distribution-csit-managed-vanadium #200 started.
distribution-csit-managed-vanadium #200 completed. Result was SUCCESS
Waiting for the completion of jsonrpc-csit-1node-basic-only-vanadium
jsonrpc-csit-1node-basic-only-vanadium #201 started.
jsonrpc-csit-1node-basic-only-vanadium #201 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-cbench-only-vanadium
openflowplugin-csit-1node-cbench-only-vanadium #200 started.
openflowplugin-csit-1node-cbench-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-flow-services-all-vanadium
openflowplugin-csit-1node-flow-services-all-vanadium #200 started.
openflowplugin-csit-1node-flow-services-all-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-bulkomatic-only-vanadium
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #200 started.
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-stats-collection-only-vanadium
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #200 started.
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-link-only-vanadium
openflowplugin-csit-1node-scale-link-only-vanadium #200 started.
openflowplugin-csit-1node-scale-link-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-only-vanadium
openflowplugin-csit-1node-scale-only-vanadium #200 started.
openflowplugin-csit-1node-scale-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-switch-only-vanadium
openflowplugin-csit-1node-scale-switch-only-vanadium #200 started.
openflowplugin-csit-1node-scale-switch-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #200 started.
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-only-vanadium
openflowplugin-csit-3node-clustering-only-vanadium #200 started.
openflowplugin-csit-3node-clustering-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #200 started.
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #200 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-1node-upstream-southbound-all-vanadium
ovsdb-csit-1node-upstream-southbound-all-vanadium #201 started.
ovsdb-csit-1node-upstream-southbound-all-vanadium #201 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-3node-upstream-clustering-only-vanadium
ovsdb-csit-3node-upstream-clustering-only-vanadium #200 started.
ovsdb-csit-3node-upstream-clustering-only-vanadium #200 completed. Result was UNSTABLE
Build step 'Trigger/call builds on other projects' changed build result to UNSTABLE
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 3126 killed;
[ssh-agent] Stopped.
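Note on the dependency conflict pip reported during venv setup above (httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but pyparsing 2.4.7 is installed): the version check pip performs can be illustrated mechanically. A minimal sketch using only the standard library; the `parse` helper is a simplified, hypothetical stand-in for pip's real version parsing and handles only plain dotted versions:

```python
def parse(v):
    """Parse a simple dotted version string into a comparable tuple.

    Simplified illustration only -- real pip handles pre-releases,
    post-releases, epochs, etc. per PEP 440.
    """
    return tuple(int(p) for p in v.split("."))

installed = parse("2.4.7")                  # pyparsing version in the venv
lower, upper = parse("3.0.4"), parse("4")   # httplib2's constraint: pyparsing<4,>=3.0.4

# (2, 4, 7) is below the (3, 0, 4) lower bound, hence the resolver error.
satisfied = lower <= installed < upper
print(satisfied)  # False
```

Pinning `pyparsing>=3.0.4,<4` in the job's requirements (or letting pip upgrade it) would satisfy the constraint; as a warning rather than an error, the conflict did not fail this build step.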
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins14925989396475727976.sh
python /tmp/tmp.f77umHEXZK https://jenkins.opendaylight.org/releng/job/integration-distribution-test-vanadium/202/
/tmp/tmp.f77umHEXZK:26: DeprecationWarning: Call to deprecated method findAll. (Replaced by find_all) -- Deprecated since version 4.0.0.
  links = soup.findAll("a", { "class" : "model-link" })
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins6395169936663725357.sh
---> sysstat.sh
/tmp/jenkins6395169936663725357.sh: line 19: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins8052626567294726914.sh
---> package-listing.sh
++ tr '[:upper:]' '[:lower:]'
++ facter osfamily
/tmp/jenkins8052626567294726914.sh: line 19: facter: command not found
+ OS_FAMILY=
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins4465823393787056351.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-DYL3 from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-DYL3/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins13356354806665556378.sh
provisioning config files...
Could not find credentials [logs] for integration-distribution-test-vanadium #202
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/integration-distribution-test-vanadium@tmp/config4085156143234565855tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins7445743812738456464.sh
---> create-netrc.sh
WARN: Log server credential not found.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins13348657301172684171.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-DYL3 from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-DYL3/bin to PATH
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins12510061255800027070.sh
---> uv-install.sh
uv 0.11.7 is already installed
uvx 0.11.7 (x86_64-unknown-linux-gnu)
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins6174173544763310034.sh
---> sudo-logs.sh
Archiving 'sudo' log..
/tmp/jenkins6174173544763310034.sh: line 41: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins10489423514095233714.sh
---> job-cost.sh
INFO: Activating Python virtual environment...
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-DYL3 from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-DYL3/bin to PATH
INFO: No stack-cost file found
INFO: Instance uptime: 4212s
INFO: Fetching instance metadata (attempt 1 of 3)...
DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
INFO: Successfully fetched instance metadata
INFO: Instance type: v3-standard-2
INFO: Retrieving pricing info for: v3-standard-2
INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=4212
INFO: Successfully fetched Vexxhost pricing API
INFO: Retrieved cost: 0.11
INFO: Retrieved resource: v3-standard-2
INFO: Creating archive directory: /w/workspace/integration-distribution-test-vanadium/archives/cost
INFO: Archiving costs to: /w/workspace/integration-distribution-test-vanadium/archives/cost.csv
INFO: Successfully archived job cost data
DEBUG: Cost data: integration-distribution-test-vanadium,202,2026-04-23 03:59:01,v3-standard-2,4212,0.11,0.00,UNSTABLE
[integration-distribution-test-vanadium] $ /bin/bash -l /tmp/jenkins16011875102521886012.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-DYL3 from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-DYL3/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/integration-distribution-test-vanadium/202/
INFO: archiving logs to S3

---> uname -a:
Linux prd-queue-disttest-2c-1g-6901 6.8.0-90-generic #91-Ubuntu SMP PREEMPT_DYNAMIC Tue Nov 18 14:14:30 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:                         x86_64
CPU op-mode(s):                       32-bit, 64-bit
Address sizes:                        40 bits physical, 48 bits virtual
Byte Order:                           Little Endian
CPU(s):                               2
On-line CPU(s) list:                  0,1
Vendor ID:                            AuthenticAMD
Model name:                           AMD EPYC-Rome Processor
CPU family:                           23
Model:                                49
Thread(s) per core:                   1
Core(s) per socket:                   1
Socket(s):                            2
Stepping:                             0
BogoMIPS:                             5600.00
Flags:                                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
Virtualization:                       AMD-V
Hypervisor vendor:                    KVM
Virtualization type:                  full
L1d cache:                            64 KiB (2 instances)
L1i cache:                            64 KiB (2 instances)
L2 cache:                             1 MiB (2 instances)
L3 cache:                             32 MiB (2 instances)
NUMA node(s):                         1
NUMA node0 CPU(s):                    0,1
Vulnerability Gather data sampling:   Not affected
Vulnerability Itlb multihit:          Not affected
Vulnerability L1tf:                   Not affected
Vulnerability Mds:                    Not affected
Vulnerability Meltdown:               Not affected
Vulnerability Mmio stale data:        Not affected
Vulnerability Reg file data sampling:
Not affected
Vulnerability Retbleed:               Mitigation; untrained return thunk; SMT disabled
Vulnerability Spec rstack overflow:   Vulnerable: Safe RET, no microcode
Vulnerability Spec store bypass:      Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:             Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:             Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                  Not affected
Vulnerability Tsx async abort:        Not affected
Vulnerability Vmscape:                Not affected

---> nproc:
2

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           795M  1.1M  794M   1% /run
/dev/vda1        38G  7.8G   30G  21% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/vda16      881M  117M  703M  15% /boot
/dev/vda15      105M  6.2M   99M   6% /boot/efi
tmpfs           795M   16K  795M   1% /run/user/1001

---> free -m:
               total        used        free      shared  buff/cache   available
Mem:            7941         705        5504           4        2016        7236
Swap:           1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:15:3a:6c brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    inet 10.30.171.131/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 82184sec preferred_lft 82184sec
    inet6 fe80::f816:3eff:fe15:3a6c/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-6901)  04/23/26  _x86_64_  (2 CPU)

02:49:02     LINUX RESTART  (2 CPU)

02:50:02          tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
03:00:48         4.85      0.06      4.57      0.22      6.17   1103.91      3.56
03:10:01         1.20      0.01      1.18      0.01      0.06     14.19      0.78
03:20:48         1.17      0.00      1.15      0.02      0.01     13.84      1.13
03:30:48         1.46      0.00      1.42      0.04      0.00     17.72      1.32
03:40:01
     1.24      0.00      1.22      0.02      0.00     14.76      1.17
03:50:48         1.89      0.30      1.55      0.03     37.14     69.80      1.35
Average:         2.01      0.07      1.89      0.06      7.70    217.61      1.58

02:50:02    kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
03:00:48      5790480   7512912    277796      3.42     61664   1840956    534668      5.82    466752   1619488       156
03:10:01      5788720   7511608    278884      3.43     61832   1841256    534804      5.83    466928   1619944       152
03:20:48      5788196   7511648    278888      3.43     62028   1841612    534892      5.83    466912   1620508       124
03:30:48      5774612   7498588    291936      3.59     62224   1841936    534964      5.83    467144   1621028       124
03:40:01      5773856   7498276    292248      3.59     62376   1842228    534888      5.83    467240   1621472       124
03:50:48      5733064   7486848    303288      3.73     62692   1870976    534964      5.83    478760   1639032       124
Average:      5774821   7503313    287173      3.53     62136   1846494    534863      5.83    468956   1623579       134

02:50:02        IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
03:00:48           lo      1.02      1.02      0.05      0.05      0.00      0.00      0.00      0.00
03:00:48         ens3      5.49      2.11     41.77      0.42      0.00      0.00      0.00      0.00
03:10:01           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:10:01         ens3      0.89      0.04      0.08      0.02      0.00      0.00      0.00      0.00
03:20:48           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:20:48         ens3      0.73      0.02      0.08      0.01      0.00      0.00      0.00      0.00
03:30:48           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:30:48         ens3      0.90      0.06      0.14      0.05      0.00      0.00      0.00      0.00
03:40:01           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
03:40:01         ens3      0.37      0.06      0.09      0.04      0.00      0.00      0.00      0.00
03:50:48           lo      1.02      1.02      0.05      0.05      0.00      0.00      0.00      0.00
03:50:48         ens3      0.68      0.23      2.89      0.03      0.00      0.00      0.00      0.00
Average:           lo      1.01      1.01      0.05      0.05      0.00      0.00      0.00      0.00
Average:         ens3      1.56      0.45      7.97      0.10      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-6901)  04/23/26  _x86_64_  (2 CPU)

02:49:02     LINUX RESTART  (2 CPU)

02:50:02        CPU     %user     %nice   %system   %iowait    %steal     %idle
03:00:48        all      1.04      0.00      0.32      0.06      0.03     98.55
03:00:48          0      1.34      0.00      0.40      0.07      0.04     98.15
03:00:48          1      0.73      0.00      0.24      0.05      0.03     98.94
03:10:01        all      0.24      0.00      0.09      0.01      0.03     99.63
03:10:01          0
     0.15      0.00      0.09      0.01      0.03     99.73
03:10:01          1      0.33      0.00      0.10      0.01      0.03     99.54
03:20:48        all      0.25      0.00      0.07      0.01      0.02     99.65
03:20:48          0      0.05      0.00      0.07      0.00      0.03     99.85
03:20:48          1      0.45      0.00      0.07      0.01      0.02     99.45
03:30:48        all      0.29      0.00      0.19      0.01      0.02     99.50
03:30:48          0      0.34      0.00      0.18      0.01      0.02     99.46
03:30:48          1      0.24      0.00      0.20      0.01      0.02     99.54
03:40:01        all      0.23      0.00      0.07      0.06      0.09     99.54
03:40:01          0      0.33      0.00      0.07      0.12      0.01     99.47
03:40:01          1      0.14      0.00      0.08      0.00      0.16     99.61
03:50:48        all      0.37      0.00      0.13      0.02      0.01     99.46
03:50:48          0      0.29      0.00      0.15      0.02      0.01     99.53
03:50:48          1      0.45      0.00      0.12      0.02      0.02     99.39
Average:        all      0.41      0.00      0.15      0.03      0.03     99.38
Average:          0      0.42      0.00      0.16      0.04      0.02     99.35
Average:          1      0.40      0.00      0.13      0.02      0.04     99.40
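Note on the DeprecationWarning emitted by the post-build scraper earlier in this log (`findAll` was replaced by `find_all` in BeautifulSoup 4.0.0): the fix is a one-line rename in /tmp/tmp.f77umHEXZK. A minimal sketch, assuming the `beautifulsoup4` package listed in the venv above; the sample markup is invented for illustration, only the `model-link` class comes from the logged line:

```python
from bs4 import BeautifulSoup  # beautifulsoup4, present in the job's venv

# Invented sample markup standing in for the Jenkins job page being scraped.
html = '<a class="model-link" href="job/1/">#1</a><a class="model-link" href="job/2/">#2</a>'
soup = BeautifulSoup(html, "html.parser")

# Deprecated spelling from the warning:
#   links = soup.findAll("a", { "class" : "model-link" })
# Modern equivalent with identical behaviour:
links = soup.find_all("a", class_="model-link")
print([a["href"] for a in links])  # ['job/1/', 'job/2/']
```

The `class_` keyword is the documented shorthand for the attrs-dict form; either spelling of the filter works, only the method name changed.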