Started by upstream project "autorelease-release-vanadium-mvn39-openjdk21" build number 270
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-queue-disttest-2c-1g-7620 (queue-disttest-2c-1g) in workspace /w/workspace/integration-distribution-test-vanadium
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-KzlJAW6BQwb1/agent.3101
SSH_AGENT_PID=3103
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/integration-distribution-test-vanadium@tmp/private_key_8125244185276228772.key (/w/workspace/integration-distribution-test-vanadium@tmp/private_key_8125244185276228772.key)
[ssh-agent] Using credentials jenkins (jenkins-ssh)
No emails were triggered.
provisioning config files...
copy managed file [npmrc] to file:/home/jenkins/.npmrc
copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins3514987853206161896.sh
---> python-tools-install.sh
Setup pyenv:
* system (set by /opt/pyenv/version)
* 3.8.20 (set by /opt/pyenv/version)
* 3.9.20 (set by /opt/pyenv/version)
  3.10.15
  3.11.10
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-O1aH
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-O1aH/bin to PATH
Generating Requirements File
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
Python 3.11.10
pip 26.0.1 from /tmp/venv-O1aH/lib/python3.11/site-packages/pip (python 3.11)
appdirs==1.4.4
argcomplete==3.6.3
aspy.yaml==1.3.0
attrs==26.1.0
autopage==0.6.0
beautifulsoup4==4.14.3
boto3==1.42.96
botocore==1.42.96
bs4==0.0.2
certifi==2026.4.22
cffi==2.0.0
cfgv==3.5.0
chardet==7.4.3
charset-normalizer==3.4.7
click==8.3.3
cliff==4.13.3
cmd2==3.5.1
cryptography==3.3.2
debtcollector==3.1.0
decorator==5.2.1
defusedxml==0.7.1
Deprecated==1.3.1
distlib==0.4.0
dnspython==2.8.0
docker==7.1.0
dogpile.cache==1.5.0
durationpy==0.10
email-validator==2.3.0
filelock==3.29.0
future==1.0.0
gitdb==4.0.12
GitPython==3.1.47
httplib2==0.30.2
identify==2.6.19
idna==3.13
importlib-resources==1.5.0
iso8601==2.1.0
Jinja2==3.1.6
jmespath==1.1.0
jsonpatch==1.33
jsonpointer==3.1.1
jsonschema==4.26.0
jsonschema-specifications==2025.9.1
keystoneauth1==5.13.1
kubernetes==35.0.0
lftools==0.37.22
lxml==6.1.0
markdown-it-py==4.0.0
MarkupSafe==3.0.3
mdurl==0.1.2
msgpack==1.1.2
multi_key_dict==2.0.3
munch==4.0.0
netaddr==1.3.0
niet==1.4.2
nodeenv==1.10.0
oauth2client==4.1.3
oauthlib==3.3.1
openstacksdk==4.11.0
os-service-types==1.8.2
osc-lib==4.5.0
oslo.config==10.3.0
oslo.context==6.3.0
oslo.i18n==6.7.2
oslo.log==8.1.0
oslo.serialization==5.9.1
oslo.utils==10.0.1
packaging==26.2
pbr==7.0.3
platformdirs==4.9.6
prettytable==3.17.0
psutil==7.2.2
pyasn1==0.6.3
pyasn1_modules==0.4.2
pycparser==3.0
pygerrit2==2.0.15
PyGithub==2.9.1
Pygments==2.20.0
PyJWT==2.12.1
PyNaCl==1.6.2
pyparsing==2.4.7
pyperclip==1.11.0
pyrsistent==0.20.0
python-cinderclient==9.9.0
python-dateutil==2.9.0.post0
python-discovery==1.2.2
python-heatclient==5.1.0
python-jenkins==1.8.3
python-keystoneclient==5.8.0
python-magnumclient==4.10.0
python-openstackclient==9.0.0
python-swiftclient==4.10.0
PyYAML==6.0.3
referencing==0.37.0
requests==2.33.1
requests-oauthlib==2.0.0
rfc3986==2.0.0
rich==15.0.0
rich-argparse==1.7.2
rpds-py==0.30.0
rsa==4.9.1
ruamel.yaml==0.19.1
ruamel.yaml.clib==0.2.15
s3transfer==0.16.1
simplejson==4.1.1
six==1.17.0
smmap==5.0.3
soupsieve==2.8.3
stevedore==5.7.0
tabulate==0.10.0
toml==0.10.2
tomlkit==0.14.0
tqdm==4.67.3
typing_extensions==4.15.0
urllib3==1.26.20
virtualenv==21.2.4
wcwidth==0.6.0
websocket-client==1.9.0
wrapt==2.1.2
xdg==6.0.0
xmltodict==1.0.4
yq==3.4.3
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins11611666377734555182.sh
---> uv-install.sh
Installing uv/uvx (latest) using shell installer
2026-04-25 00:34:51 URL:https://releases.astral.sh/installers/uv/latest/uv-installer.sh [71225/71225] -> "/tmp/uv-install-45h6V9.sh" [1]
downloading uv 0.11.7 x86_64-unknown-linux-gnu
installing to /home/jenkins/.local/bin
  uv
  uvx
everything's installed!

To add $HOME/.local/bin to your PATH, either restart your shell or run:

    source $HOME/.local/bin/env (sh, bash, zsh)
    source $HOME/.local/bin/env.fish (fish)
Adding install location to PATH
---> Validating uv/uvx install
uvx 0.11.7 (x86_64-unknown-linux-gnu)
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-all-vanadium
bgpcep-csit-1node-bgp-ingest-all-vanadium #203 started.
bgpcep-csit-1node-bgp-ingest-all-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #203 started.
bgpcep-csit-1node-bgp-ingest-mixed-all-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-throughpcep-all-vanadium
bgpcep-csit-1node-throughpcep-all-vanadium #203 started.
bgpcep-csit-1node-throughpcep-all-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of bgpcep-csit-1node-userfeatures-all-vanadium
bgpcep-csit-1node-userfeatures-all-vanadium #203 started.
bgpcep-csit-1node-userfeatures-all-vanadium #203 completed.
Result was UNSTABLE
Waiting for the completion of daexim-csit-1node-basic-only-vanadium
daexim-csit-1node-basic-only-vanadium #203 started.
daexim-csit-1node-basic-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of daexim-csit-3node-clustering-basic-only-vanadium
daexim-csit-3node-clustering-basic-only-vanadium #203 started.
daexim-csit-3node-clustering-basic-only-vanadium #203 completed. Result was FAILURE
Waiting for the completion of distribution-csit-managed-vanadium
distribution-csit-managed-vanadium #203 started.
distribution-csit-managed-vanadium #203 completed. Result was SUCCESS
Waiting for the completion of jsonrpc-csit-1node-basic-only-vanadium
jsonrpc-csit-1node-basic-only-vanadium #204 started.
jsonrpc-csit-1node-basic-only-vanadium #204 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-cbench-only-vanadium
openflowplugin-csit-1node-cbench-only-vanadium #203 started.
openflowplugin-csit-1node-cbench-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-flow-services-all-vanadium
openflowplugin-csit-1node-flow-services-all-vanadium #203 started.
openflowplugin-csit-1node-flow-services-all-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-perf-bulkomatic-only-vanadium
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #203 started.
openflowplugin-csit-1node-perf-bulkomatic-only-vanadium #203 completed. Result was FAILURE
Waiting for the completion of openflowplugin-csit-1node-perf-stats-collection-only-vanadium
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #203 started.
openflowplugin-csit-1node-perf-stats-collection-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-link-only-vanadium
openflowplugin-csit-1node-scale-link-only-vanadium #203 started.
openflowplugin-csit-1node-scale-link-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-1node-scale-only-vanadium
openflowplugin-csit-1node-scale-only-vanadium #203 started.
openflowplugin-csit-1node-scale-only-vanadium #203 completed. Result was FAILURE
Waiting for the completion of openflowplugin-csit-1node-scale-switch-only-vanadium
openflowplugin-csit-1node-scale-switch-only-vanadium #203 started.
openflowplugin-csit-1node-scale-switch-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #203 started.
openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium #203 completed. Result was FAILURE
Waiting for the completion of openflowplugin-csit-3node-clustering-only-vanadium
openflowplugin-csit-3node-clustering-only-vanadium #203 started.
openflowplugin-csit-3node-clustering-only-vanadium #203 completed. Result was FAILURE
Waiting for the completion of openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #203 started.
openflowplugin-csit-3node-clustering-perf-bulkomatic-only-vanadium #203 completed. Result was UNSTABLE
Waiting for the completion of ovsdb-csit-1node-upstream-southbound-all-vanadium
ovsdb-csit-1node-upstream-southbound-all-vanadium #204 started.
ovsdb-csit-1node-upstream-southbound-all-vanadium #204 completed. Result was FAILURE
Waiting for the completion of ovsdb-csit-3node-upstream-clustering-only-vanadium
ovsdb-csit-3node-upstream-clustering-only-vanadium #203 started.
ovsdb-csit-3node-upstream-clustering-only-vanadium #203 completed.
Result was UNSTABLE
Build step 'Trigger/call builds on other projects' changed build result to UNSTABLE
Build step 'Trigger/call builds on other projects' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 3103 killed;
[ssh-agent] Stopped.
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins13357947410262886821.sh
python /tmp/tmp.P3j7joTTel https://jenkins.opendaylight.org/releng/job/integration-distribution-test-vanadium/205/
/tmp/tmp.P3j7joTTel:26: DeprecationWarning: Call to deprecated method findAll. (Replaced by find_all) -- Deprecated since version 4.0.0.
  links = soup.findAll("a", { "class" : "model-link" })
[PostBuildScript] - [INFO] Executing post build scripts.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins4478997175193457990.sh
---> sysstat.sh
/tmp/jenkins4478997175193457990.sh: line 19: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins430916641141624399.sh
---> package-listing.sh
++ facter osfamily
/tmp/jenkins430916641141624399.sh: line 19: facter: command not found
++ tr '[:upper:]' '[:lower:]'
+ OS_FAMILY=
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins17257300071522302034.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-O1aH from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
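The DeprecationWarning above is from BeautifulSoup 4: `findAll` was renamed `find_all` in version 4.0, so the direct fix in /tmp/tmp.P3j7joTTel is `soup.find_all("a", {"class": "model-link"})`. As a stdlib-only sketch of the same extraction (the sample HTML below is hypothetical; the real script parses the Jenkins build page), the link collection can also be done with `html.parser`:

```python
from html.parser import HTMLParser

class ModelLinkParser(HTMLParser):
    """Collect href values of <a class="model-link"> elements."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        # Class attributes are space-separated; match the "model-link" token.
        if tag == "a" and "model-link" in attr_map.get("class", "").split():
            self.links.append(attr_map.get("href"))

# Hypothetical sample; the real script fetches the Jenkins job page.
html = '<a class="model-link" href="/job/foo/1/">#1</a><a href="/other">x</a>'
parser = ModelLinkParser()
parser.feed(html)
print(parser.links)  # ['/job/foo/1/']
```

This avoids the bs4 dependency entirely, though renaming `findAll` to `find_all` is the smaller change if BeautifulSoup stays in use.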
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-O1aH/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins8476204489751252417.sh
provisioning config files...
Could not find credentials [logs] for integration-distribution-test-vanadium #205
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/integration-distribution-test-vanadium@tmp/config12583152512002292010tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index]
Run condition [Regular expression match] enabling perform for step [Provide Configuration files]
provisioning config files...
copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins12633562951655460853.sh
---> create-netrc.sh
WARN: Log server credential not found.
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins15681126811086445432.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-O1aH from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-O1aH/bin to PATH
[integration-distribution-test-vanadium] $ /bin/sh /tmp/jenkins9621686730550618922.sh
---> uv-install.sh
uv 0.11.7 is already installed
uvx 0.11.7 (x86_64-unknown-linux-gnu)
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins1914864803198475797.sh
---> sudo-logs.sh
Archiving 'sudo' log..
/tmp/jenkins1914864803198475797.sh: line 41: facter: command not found
[integration-distribution-test-vanadium] $ /bin/bash /tmp/jenkins7717245423657421881.sh
---> job-cost.sh
INFO: Activating Python virtual environment...
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-O1aH from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-O1aH/bin to PATH
INFO: No stack-cost file found
INFO: Instance uptime: 4407s
INFO: Fetching instance metadata (attempt 1 of 3)...
DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
INFO: Successfully fetched instance metadata
INFO: Instance type: v3-standard-2
INFO: Retrieving pricing info for: v3-standard-2
INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=4407
INFO: Successfully fetched Vexxhost pricing API
INFO: Retrieved cost: 0.11
INFO: Retrieved resource: v3-standard-2
INFO: Creating archive directory: /w/workspace/integration-distribution-test-vanadium/archives/cost
INFO: Archiving costs to: /w/workspace/integration-distribution-test-vanadium/archives/cost.csv
INFO: Successfully archived job cost data
DEBUG: Cost data: integration-distribution-test-vanadium,205,2026-04-25 01:46:57,v3-standard-2,4407,0.11,0.00,FAILURE
[integration-distribution-test-vanadium] $ /bin/bash -l /tmp/jenkins1874127736810780233.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.20
  3.9.20
  3.10.15
* 3.11.10 (set by /w/workspace/integration-distribution-test-vanadium/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-O1aH from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
lf-activate-venv(): INFO: Attempting to install with network-safe options...
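The job-cost step above reads the instance uptime, fetches the flavor from the OpenStack metadata endpoint, queries the Vexxhost pricing API with that uptime, and archives one CSV row. A minimal sketch of the URL construction and CSV formatting, assuming only what the log shows (the two URLs, the uptime in seconds, and the logged CSV row; function names here are hypothetical, and the real job-cost.sh also retries fetches and handles a stack-cost file):

```python
# Metadata endpoint queried by the job, per the DEBUG line in the log.
METADATA_URL = "http://169.254.169.254/latest/meta-data/instance-type"

def pricing_url(instance_type: str, uptime_seconds: int) -> str:
    """Build the Vexxhost pricing query for a given flavor and uptime."""
    return (f"https://pricing.vexxhost.net/v1/pricing/"
            f"{instance_type}/cost?seconds={uptime_seconds}")

def cost_csv_row(job: str, build: int, stamp: str, flavor: str,
                 uptime: int, cost: float, stack_cost: float, result: str) -> str:
    """Format one archives/cost.csv line matching the logged DEBUG output."""
    return (f"{job},{build},{stamp},{flavor},{uptime},"
            f"{cost:.2f},{stack_cost:.2f},{result}")

print(pricing_url("v3-standard-2", 4407))
# https://pricing.vexxhost.net/v1/pricing/v3-standard-2/cost?seconds=4407
```

With the values from this build, `cost_csv_row("integration-distribution-test-vanadium", 205, "2026-04-25 01:46:57", "v3-standard-2", 4407, 0.11, 0.00, "FAILURE")` reproduces the logged cost-data line.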
lf-activate-venv(): INFO: Base packages installed successfully
lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-O1aH/bin to PATH
WARNING: Nexus logging server not set
INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/integration-distribution-test-vanadium/205/
INFO: archiving logs to S3

---> uname -a:
Linux prd-queue-disttest-2c-1g-7620 6.8.0-90-generic #91-Ubuntu SMP PREEMPT_DYNAMIC Tue Nov 18 14:14:30 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:         x86_64
CPU op-mode(s):       32-bit, 64-bit
Address sizes:        40 bits physical, 48 bits virtual
Byte Order:           Little Endian
CPU(s):               2
On-line CPU(s) list:  0,1
Vendor ID:            AuthenticAMD
Model name:           AMD EPYC-Rome Processor
CPU family:           23
Model:                49
Thread(s) per core:   1
Core(s) per socket:   1
Socket(s):            2
Stepping:             0
BogoMIPS:             5599.99
Flags:                fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
Virtualization:       AMD-V
Hypervisor vendor:    KVM
Virtualization type:  full
L1d cache:            64 KiB (2 instances)
L1i cache:            64 KiB (2 instances)
L2 cache:             1 MiB (2 instances)
L3 cache:             32 MiB (2 instances)
NUMA node(s):         1
NUMA node0 CPU(s):    0,1
Vulnerability Gather data sampling:   Not affected
Vulnerability Itlb multihit:          Not affected
Vulnerability L1tf:                   Not affected
Vulnerability Mds:                    Not affected
Vulnerability Meltdown:               Not affected
Vulnerability Mmio stale data:        Not affected
Vulnerability Reg file data sampling: Not affected
Vulnerability Retbleed:               Mitigation; untrained return thunk; SMT disabled
Vulnerability Spec rstack overflow:   Vulnerable: Safe RET, no microcode
Vulnerability Spec store bypass:      Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:             Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:             Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
Vulnerability Srbds:                  Not affected
Vulnerability Tsx async abort:        Not affected
Vulnerability Vmscape:                Not affected

---> nproc:
2

---> df -h:
Filesystem      Size  Used Avail Use% Mounted on
tmpfs           795M  1.1M  794M   1% /run
/dev/vda1        38G  7.8G   30G  21% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
/dev/vda16      881M  117M  703M  15% /boot
/dev/vda15      105M  6.2M   99M   6% /boot/efi
tmpfs           795M   16K  795M   1% /run/user/1001

---> free -m:
               total        used        free      shared  buff/cache   available
Mem:            7941         687        5525           4        2013        7254
Swap:           1023           0        1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:e2:3b:07 brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    inet 10.30.171.226/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
       valid_lft 81968sec preferred_lft 81968sec
    inet6 fe80::f816:3eff:fee2:3b07/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-7620)  04/25/26  _x86_64_  (2 CPU)

00:33:42     LINUX RESTART      (2 CPU)

00:40:01        tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
00:50:02       1.21      0.01      1.20      0.00      0.05     14.92      0.36
01:00:02       1.42      0.00      1.38      0.03      0.00     17.65      1.31
01:10:01       1.19      0.00      1.17      0.02      0.00     14.11      1.09
01:20:02       1.21      0.00      1.19      0.02      0.01     14.78      1.21
01:30:02       1.97      0.32      1.61      0.04     39.03     75.31      1.51
01:40:01       1.27      0.01      1.25      0.02      1.05     15.54      0.79
Average:       1.38      0.05      1.30      0.02      6.69     25.38      1.04

00:40:01  kbmemfree  kbavail  kbmemused  %memused  kbbuffers  kbcached  kbcommit  %commit  kbactive  kbinact  kbdirty
00:50:02    5798696  7517080     273752      3.37      61676   1837008    537252     5.85    464464  1615948      172
01:00:02    5798260  7517172     273636      3.36      61860   1837352    537388     5.85    464932  1616476      188
01:10:01    5806632  7526044     264764      3.26      62036   1837676    537396     5.85    464688  1616976      128
01:20:02    5805376  7525308     265496      3.26      62232   1838008    537396     5.85    464884  1617500      200
01:30:02    5750300  7499260     291156      3.58      62516   1866420    739832     8.06    481796  1635008      228
01:40:01    5758032  7507828     282688      3.48      62700   1867060    537460     5.85    476332  1635520      244
Average:    5786216  7515449     275249      3.38      62170   1847254    571121     6.22    469516  1622905      193

00:40:01    IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
00:50:02       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
00:50:02     ens3      2.32      0.23      0.46      0.18      0.00      0.00      0.00      0.00
01:00:02       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
01:00:02     ens3      0.90      0.08      0.15      0.05      0.00      0.00      0.00      0.00
01:10:01       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
01:10:01     ens3      1.37      0.06      0.21      0.06      0.00      0.00      0.00      0.00
01:20:02       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
01:20:02     ens3      0.83      0.15      0.23      0.13      0.00      0.00      0.00      0.00
01:30:02       lo      1.02      1.02      0.05      0.05      0.00      0.00      0.00      0.00
01:30:02     ens3      0.51      0.22      3.14      0.05      0.00      0.00      0.00      0.00
01:40:01       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
01:40:01     ens3      0.34      0.11      0.11      0.06      0.00      0.00      0.00      0.00
Average:       lo      1.00      1.00      0.05      0.05      0.00      0.00      0.00      0.00
Average:     ens3      1.05      0.14      0.72      0.09      0.00      0.00      0.00      0.00

---> sar -P ALL:
Linux 6.8.0-90-generic (prd-queue-disttest-2c-1g-7620)  04/25/26  _x86_64_  (2 CPU)

00:33:42     LINUX RESTART      (2 CPU)

00:40:01      CPU     %user     %nice   %system   %iowait    %steal     %idle
00:50:02      all      0.29      0.00      0.10      0.01      0.03     99.57
00:50:02        0      0.08      0.00      0.10      0.01      0.03     99.78
00:50:02        1      0.49      0.00      0.11      0.01      0.02     99.37
01:00:02      all      0.27      0.00      0.19      0.02      0.02     99.50
01:00:02        0      0.35      0.00      0.17      0.03      0.02     99.44
01:00:02        1      0.20      0.00      0.20      0.00      0.03     99.57
01:10:01      all      0.26      0.00      0.09      0.01      0.02     99.62
01:10:01        0      0.36      0.00      0.08      0.01      0.02     99.53
01:10:01        1      0.16      0.00      0.10      0.00      0.03     99.71
01:20:02      all      0.28      0.00      0.08      0.01      0.02     99.62
01:20:02        0      0.16      0.00      0.08      0.01      0.02     99.73
01:20:02        1      0.39      0.00      0.09      0.00      0.02     99.50
01:30:02      all      0.38      0.00      0.20      0.02      0.02     99.38
01:30:02        0      0.59      0.00      0.22      0.03      0.02     99.15
01:30:02        1      0.17      0.00      0.19      0.01      0.02     99.62
01:40:01      all      0.25      0.00      0.07      0.01      0.02     99.66
01:40:01        0      0.44      0.00      0.05      0.01      0.01     99.49
01:40:01        1      0.06      0.00      0.08      0.01      0.02     99.83
Average:      all      0.29      0.00      0.12      0.01      0.02     99.56
Average:        0      0.33      0.00      0.12      0.02      0.02     99.52
Average:        1      0.24      0.00      0.13      0.01      0.02     99.60