Started by upstream project "autorelease-release-vanadium-mvn39-openjdk21" build number 272
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-queue-disttest-2c-1g-8349 (queue-disttest-2c-1g) in workspace /w/workspace/integration-distribution-test-vanadium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-z0eIWfoBzDi0/agent.3130
SSH_AGENT_PID=3132
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/integration-distribution-test-vanadium@tmp/private_key_11237971276457664644.key (/w/workspace/integration-distribution-test-vanadium@tmp/private_key_11237971276457664644.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
No emails were triggered.
provisioning config files...
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins12106841299118884638.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.20 (set by /opt/pyenv/version)
* 3.9.20 (set by /opt/pyenv/version)
  3.10.15
  3.11.10
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-En9f
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-En9f/bin to PATH
Generating Requirements File
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
Python 3.11.10
pip 26.1 from /tmp/venv-En9f/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.6.3
aspy.yaml==1.3.0
attrs==26.1.0
autopage==0.6.0
beautifulsoup4==4.14.3
boto3==1.42.96
botocore==1.42.96
bs4==0.0.2
certifi==2026.4.22
cffi==2.0.0
cfgv==3.5.0
chardet==7.4.3
charset-normalizer==3.4.7
click==8.3.3
cliff==4.13.3
cmd2==3.5.1
cryptography==3.3.2
debtcollector==3.1.0
decorator==5.2.1
defusedxml==0.7.1
Deprecated==1.3.1
distlib==0.4.0
dnspython==2.8.0
docker==7.1.0
dogpile.cache==1.5.0
durationpy==0.10
email-validator==2.3.0
filelock==3.29.0
future==1.0.0
gitdb==4.0.12
GitPython==3.1.47
httplib2==0.30.2
identify==2.6.19
idna==3.13
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.6
jmespath==1.1.0
jsonpatch==1.33
jsonpointer==3.1.1
jsonschema==4.26.0
jsonschema-specifications==2025.9.1
keystoneauth1==5.13.1
kubernetes==35.0.0
lftools==0.37.22
lxml==6.1.0
markdown-it-py==4.0.0
MarkupSafe==3.0.3
mdurl==0.1.2
msgpack==1.1.2
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
niet==1.4.2
nodeenv==1.10.0
oauth2client==4.1.3
oauthlib==3.3.1
openstacksdk==4.11.0
os-service-types==1.8.2
osc-lib==4.5.0
oslo.config==10.3.0
oslo.context==6.3.0
oslo.i18n==6.7.2
oslo.log==8.1.0
oslo.serialization==5.9.1
oslo.utils==10.0.1
packaging==26.2
pbr==7.0.3
platformdirs==4.9.6
prettytable==3.17.0
psutil==7.2.2
pyasn1==0.6.3
pyasn1_modules==0.4.2
pycparser==3.0
pygerrit2==2.0.15
PyGithub==2.9.1
Pygments==2.20.0
PyJWT==2.12.1
PyNaCl==1.6.2
pyparsing==2.4.7
pyperclip==1.11.0
pyrsistent==0.20.0
python-cinderclient==9.9.0
python-dateutil==2.9.0.post0
python-discovery==1.2.2
python-heatclient==5.1.0
python-jenkins==1.8.3
python-keystoneclient==5.8.0
python-magnumclient==4.10.0
python-openstackclient==9.0.0
python-swiftclient==4.10.0
PyYAML==6.0.3
referencing==0.37.0
requests==2.33.1
requests-oauthlib==2.0.0
rfc3986==2.0.0
rich==15.0.0
rich-argparse==1.7.2
rpds-py==0.30.0
rsa==4.9.1
ruamel.yaml==0.19.1
ruamel.yaml.clib==0.2.15
s3transfer==0.16.1
simplejson==4.1.1
six==1.17.0
smmap==5.0.3
soupsieve==2.8.3
stevedore==5.7.0
tabulate==0.10.0
toml==0.10.2
tomlkit==0.14.0
tqdm==4.67.3
typing_extensions==4.15.0
urllib3==1.26.20
virtualenv==21.2.4
wcwidth==0.6.0
websocket-client==1.9.0
wrapt==2.1.2
xdg==6.0.0
xmltodict==1.0.4
yq==3.4.3
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins8000624905927262715.sh
---> uv-install.sh
Installing uv/uvx (latest) using shell installer
2026-04-27 01:39:51 URL:https://releases.astral.sh/installers/uv/latest/uv-installer.sh [71225/71225] -> "/tmp/uv-install-0x1Nqh.sh" [1]
downloading uv 0.11.7 x86_64-unknown-linux-gnu
installing to /home/jenkins/.local/bin
  uv
  uvx
everything's installed!

To add $HOME/.local/bin to your PATH, either restart your shell or run:

    source $HOME/.local/bin/env (sh, bash, zsh)
    source $HOME/.local/bin/env.fish (fish)

Adding install location to PATH
---> Validating uv/uvx install
uvx 0.11.7 (x86_64-unknown-linux-gnu)
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-all-vanadium
bgpcep-csit-1node-bgp-ingest-all-vanadium #207 started.
bgpcep-csit-1node-bgp-ingest-all-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #207 started.
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-throughpcep-all-vanadium
bgpcep-csit-1node-throughpcep-all-vanadium #207 started.
bgpcep-csit-1node-throughpcep-all-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-userfeatures-all-vanadium
bgpcep-csit-1node-userfeatures-all-vanadium #207 started.
bgpcep-csit-1node-userfeatures-all-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of daexim-csit-1node-basic-only-vanadium
daexim-csit-1node-basic-only-vanadium #207 started.
daexim-csit-1node-basic-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of daexim-csit-3node-clustering-basic-only-vanadium
daexim-csit-3node-clustering-basic-only-vanadium #207 started.
daexim-csit-3node-clustering-basic-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of distribution-csit-managed-vanadium
distribution-csit-managed-vanadium #207 started.
distribution-csit-managed-vanadium #207 completed. Result was SUCCESS
Waiting for the completion of jsonrpc-csit-1node-basic-only-vanadium
jsonrpc-csit-1node-basic-only-vanadium #208 started.
jsonrpc-csit-1node-basic-only-vanadium #208 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-cbench-only-vanadium
openflowplugin-csit-1node-cbench-only-vanadium #207 started.
openflowplugin-csit-1node-cbench-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-flow-services-all-vanadium
openflowplugin-csit-1node-flow-services-all-vanadium #207 started.
openflowplugin-csit-1node-flow-services-all-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-bulkomatic-only-vanadium
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #207 started.
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-stats-collection-only-vanadium
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #207 started.
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-link-only-vanadium
openflowplugin-csit-1node-scale-link-only-vanadium #207 started.
openflowplugin-csit-1node-scale-link-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-only-vanadium
openflowplugin-csit-1node-scale-only-vanadium #207 started.
openflowplugin-csit-1node-scale-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-switch-only-vanadium
openflowplugin-csit-1node-scale-switch-only-vanadium #207 started.
openflowplugin-csit-1node-scale-switch-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #207 started.
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-only-vanadium
openflowplugin-csit-3node-clustering-only-vanadium #207 started.
openflowplugin-csit-3node-clustering-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #207 started.
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #207 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-1node-upstream-southbound-all-vanadium
ovsdb-csit-1node-upstream-southbound-all-vanadium #208 started.
ovsdb-csit-1node-upstream-southbound-all-vanadium #208 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-3node-upstream-clustering-only-vanadium
ovsdb-csit-3node-upstream-clustering-only-vanadium #207 started.
ovsdb-csit-3node-upstream-clustering-only-vanadium #207 completed. Result was UNSTABLE
Build step 'Trigger/call builds on other projects' changed build result to UNSTABLE
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 3132 killed;
[ssh-agent] Stopped.
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins15697833257790446928.sh
python /tmp/tmp.MGQx2Mqe26 https://jenkins.opendaylight.org/releng/job/integration-distribution-test-vanadium/209/
/tmp/tmp.MGQx2Mqe26:26: DeprecationWarning: Call to deprecated method findAll. (Replaced by find_all) -- Deprecated since version 4.0.0.
  links = soup.findAll("a", { "class" : "model-link" })
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins8029605733599767588.sh
---> sysstat.sh
/tmp/jenkins8029605733599767588.sh: line 19: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins3467224700706485302.sh
---> package-listing.sh
++ facter osfamily
/tmp/jenkins3467224700706485302.sh: line 19: facter: command not found
++ tr '[:upper:]' '[:lower:]'
+ OS_FAMILY=
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins1848188748966909768.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-En9f from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-En9f/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins4109603555105705847.sh
provisioning config files...
Could not find credentials [logs] for integration-distribution-test-vanadium #209
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/integration-distribution-test-vanadium@tmp/config17043510932362645686tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins3214974882414859652.sh
---> create-netrc.sh
WARN: Log server credential not found.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins10533180039208878702.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-En9f from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-En9f/bin to PATH
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins13131028194723773455.sh
---> uv-install.sh
uv 0.11.7 is already installed
uvx 0.11.7 (x86_64-unknown-linux-gnu)
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins7887850371298269070.sh
---> sudo-logs.sh
Archiving 'sudo' log..
/tmp/jenkins7887850371298269070.sh: line 41: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins9975096550784956502.sh
---> job-cost.sh
INFO: Activating Python virtual environment...
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-En9f from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-En9f/bin to PATH
INFO: No stack-cost file found
INFO: Instance uptime: 3963s
INFO: Fetching instance metadata (attempt 1 of 3)...
DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
INFO: Successfully fetched instance metadata
INFO: Instance type: v3-standard-2
INFO: Retrieving pricing info for: v3-standard-2
INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=3963
INFO: Successfully fetched Vexxhost pricing API
INFO: Retrieved cost: 0.11
INFO: Retrieved resource: v3-standard-2
INFO: Creating archive directory: /w/workspace/integration-distribution-test-vanadium/archives/cost
INFO: Archiving costs to: /w/workspace/integration-distribution-test-vanadium/archives/cost.csv
INFO: Successfully archived job cost data
DEBUG: Cost data: integration-distribution-test-vanadium,209,2026-04-27 02:44:32,v3-standard-2,3963,0.11,0.00,UNSTABLE
[integration-distribution-test-vanadium] $ /bin/bash -l /tmp/jenkins10151855866152831467.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-En9f from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-En9f/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/integration-distribution-test-vanadium/209/
INFO: archiving logs to S3

---> uname -a:
Linux prd-queue-disttest-2c-1g-8349 6.8.0-90-generic #91-Ubuntu SMP PREEMPT_DYNAMIC Tue Nov 18 14:14:30 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:                         x86_64
CPU op-mode(s):                       32-bit, 64-bit
Address sizes:                        40 bits physical, 48 bits virtual
Byte Order:                           Little Endian
CPU(s):                               2
On-line CPU(s) list:                  0,1
Vendor ID:                            AuthenticAMD
Model name:                           AMD EPYC-Rome Processor
CPU family:                           23
Model:                                49
Thread(s) per core:                   1
Core(s) per socket:                   1
Socket(s):                            2
Stepping:                             0
BogoMIPS:                             5599.99
Flags:                                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
Virtualization:                       AMD-V
Hypervisor vendor:                    KVM
Virtualization type:                  full
L1d cache:                            64 KiB (2 instances)
L1i cache:                            64 KiB (2 instances)
L2 cache:                             1 MiB (2 instances)
L3 cache:                             32 MiB (2 instances)
NUMA node(s):                         1
NUMA node0 CPU(s):                    0,1
Vulnerability Gather data sampling:   Not affected
Vulnerability Itlb multihit:          Not affected
Vulnerability L1tf:                   Not affected
Vulnerability Mds:                    Not affected
Vulnerability Meltdown:               Not affected
Vulnerability Mmio stale data:        Not affected
Vulnerability Reg file data sampling: Not affected
Vulnerability Retbleed:               Mitigation; untrained return thunk; SMT disabled
Vulnerability Spec rstack overflow:   Vulnerable: Safe RET, no microcode
Vulnerability Spec store bypass:      Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:             Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:             Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                  Not affected
Vulnerability Tsx async abort:        Not affected
Vulnerability Vmscape:                Not affected

---> nproc:
2

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           795M  1.1M  794M   1% /run
/dev/vda1        38G  7.8G   30G  21% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/vda16      881M  117M  703M  15% /boot
/dev/vda15      105M  6.2M   99M   6% /boot/efi
tmpfs           795M   16K  795M   1% /run/user/1001

---> free -m:
               total        used        free      shared  buff/cache   available
Mem:            7941         726        5482           4        2017        7215
Swap:           1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:43:d6:92 brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    inet 10.30.171.89/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 82433sec preferred_lft 82433sec
    inet6 fe80::f816:3eff:fe43:d692/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-8349)  04/27/26  _x86_64_  (2 CPU)

01:38:40     LINUX RESTART  (2 CPU)

01:40:02          tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
01:50:08         2.81      0.03      2.72      0.06      2.27    680.12      1.56
02:00:01         1.65      0.01      1.63      0.02      0.05     19.36      0.27
02:10:08         1.35      0.00      1.35      0.00      0.00     15.40      0.00
02:20:08        14.37      0.01      1.44     12.93      0.16     16.69 107766.06
02:30:01         1.79      0.00      1.77      0.02      0.00     21.44      0.31
02:40:08         1.45      0.00      1.44      0.01      0.00     17.01      0.09
Average:         3.90      0.01      1.73      2.17      0.42    129.26  17930.85

01:40:02    kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
01:50:08      5736836   7488108    301640      3.71     61836   1869464    567212      6.18    485840   1637304       252
02:00:01      5730540   7482328    307308      3.78     62024   1869800    567204      6.18    486992   1637820       220
02:10:08      5738112   7490376    299272      3.68     62176   1870120    567212      6.18    487844   1638300       160
02:20:08      5743984   7496764    292880      3.60     62320   1870492    567336      6.18    488680   1638788       160
02:30:01      5731332   7484632    304916      3.75     62512   1870832    622420      6.78    494476   1639316       272
02:40:08      5731840   7485768    303820      3.74     62692   1871276    567412      6.18    492240   1639816       160
Average:      5735441   7487996    301639      3.71     62260   1870331    576466      6.28    489345   1638557       204

01:40:02        IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
01:50:08           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
01:50:08         ens3      3.10      0.12      0.45      0.08      0.00      0.00      0.00      0.00
02:00:01           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:00:01         ens3      0.76      0.05      0.10      0.03      0.00      0.00      0.00      0.00
02:10:08           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:10:08         ens3      0.40      0.02      0.05      0.01      0.00      0.00      0.00      0.00
02:20:08           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:20:08         ens3      0.71      0.03      2.37      0.01      0.00      0.00      0.00      0.00
02:30:01           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:30:01         ens3      0.53      0.04      0.99      0.01      0.00      0.00      0.00      0.00
02:40:08           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
02:40:08         ens3      0.46      0.11      0.16      0.09      0.00      0.00      0.00      0.00
Average:           lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
Average:         ens3      1.00      0.06      0.68      0.04      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-8349)  04/27/26  _x86_64_  (2 CPU)

01:38:40     LINUX RESTART  (2 CPU)

01:40:02        CPU     %user     %nice   %system   %iowait    %steal     %idle
01:50:08        all      0.32      0.00      0.10      0.03      0.03     99.52
01:50:08          0      0.44      0.00      0.09      0.02      0.02     99.43
01:50:08          1      0.20      0.00      0.11      0.04      0.03     99.62
02:00:01        all      0.28      0.00      0.18      0.01      0.02     99.52
02:00:01          0      0.28      0.00      0.17      0.00      0.02     99.53
02:00:01          1      0.27      0.00      0.19      0.01      0.02     99.51
02:10:08        all      0.21      0.00      0.07      0.01      0.02     99.69
02:10:08          0      0.05      0.00      0.08      0.00      0.03     99.84
02:10:08          1      0.38      0.00      0.06      0.01      0.02     99.54
02:20:08        all      0.23      0.00      0.08      0.04      0.02     99.64
02:20:08          0      0.05      0.00      0.09      0.00      0.02     99.84
02:20:08          1      0.40      0.00      0.08      0.07      0.02     99.43
02:30:01        all      0.28      0.00      0.17      0.01      0.02     99.52
02:30:01          0      0.11      0.00      0.19      0.01      0.03     99.67
02:30:01          1      0.44      0.00      0.16      0.00      0.01     99.38
02:40:08        all      0.24      0.00      0.08      0.01      0.02     99.66
02:40:08          0      0.37      0.00      0.06      0.00      0.01     99.55
02:40:08          1      0.10      0.00      0.09      0.01      0.02     99.77
Average:        all      0.26      0.00      0.11      0.02      0.02     99.59
Average:          0      0.22      0.00      0.11      0.01      0.02     99.64
Average:          1      0.30      0.00      0.12      0.02      0.02     99.54